Jan 22 04:00:23 np0005591762 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 22 04:00:23 np0005591762 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 22 04:00:23 np0005591762 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 04:00:23 np0005591762 kernel: BIOS-provided physical RAM map:
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 22 04:00:23 np0005591762 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000027fffffff] usable
Jan 22 04:00:23 np0005591762 kernel: NX (Execute Disable) protection: active
Jan 22 04:00:23 np0005591762 kernel: APIC: Static calls initialized
Jan 22 04:00:23 np0005591762 kernel: SMBIOS 2.8 present.
Jan 22 04:00:23 np0005591762 kernel: DMI: Red Hat OpenStack Compute/RHEL, BIOS 1.16.1-1.el9 04/01/2014
Jan 22 04:00:23 np0005591762 kernel: Hypervisor detected: KVM
Jan 22 04:00:23 np0005591762 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 22 04:00:23 np0005591762 kernel: kvm-clock: using sched offset of 3331316061 cycles
Jan 22 04:00:23 np0005591762 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 22 04:00:23 np0005591762 kernel: tsc: Detected 2445.404 MHz processor
Jan 22 04:00:23 np0005591762 kernel: last_pfn = 0x280000 max_arch_pfn = 0x400000000
Jan 22 04:00:23 np0005591762 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 22 04:00:23 np0005591762 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 22 04:00:23 np0005591762 kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Jan 22 04:00:23 np0005591762 kernel: found SMP MP-table at [mem 0x000f5b60-0x000f5b6f]
Jan 22 04:00:23 np0005591762 kernel: Using GB pages for direct mapping
Jan 22 04:00:23 np0005591762 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 22 04:00:23 np0005591762 kernel: ACPI: Early table checksum verification disabled
Jan 22 04:00:23 np0005591762 kernel: ACPI: RSDP 0x00000000000F5B20 000014 (v00 BOCHS )
Jan 22 04:00:23 np0005591762 kernel: ACPI: RSDT 0x000000007FFE35EB 000034 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 04:00:23 np0005591762 kernel: ACPI: FACP 0x000000007FFE3403 0000F4 (v03 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 04:00:23 np0005591762 kernel: ACPI: DSDT 0x000000007FFDFCC0 003743 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 04:00:23 np0005591762 kernel: ACPI: FACS 0x000000007FFDFC80 000040
Jan 22 04:00:23 np0005591762 kernel: ACPI: APIC 0x000000007FFE34F7 000090 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 04:00:23 np0005591762 kernel: ACPI: MCFG 0x000000007FFE3587 00003C (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 04:00:23 np0005591762 kernel: ACPI: WAET 0x000000007FFE35C3 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 04:00:23 np0005591762 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe3403-0x7ffe34f6]
Jan 22 04:00:23 np0005591762 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfcc0-0x7ffe3402]
Jan 22 04:00:23 np0005591762 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfc80-0x7ffdfcbf]
Jan 22 04:00:23 np0005591762 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe34f7-0x7ffe3586]
Jan 22 04:00:23 np0005591762 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe3587-0x7ffe35c2]
Jan 22 04:00:23 np0005591762 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe35c3-0x7ffe35ea]
Jan 22 04:00:23 np0005591762 kernel: No NUMA configuration found
Jan 22 04:00:23 np0005591762 kernel: Faking a node at [mem 0x0000000000000000-0x000000027fffffff]
Jan 22 04:00:23 np0005591762 kernel: NODE_DATA(0) allocated [mem 0x27ffd5000-0x27fffffff]
Jan 22 04:00:23 np0005591762 kernel: crashkernel reserved: 0x000000006f000000 - 0x000000007f000000 (256 MB)
Jan 22 04:00:23 np0005591762 kernel: Zone ranges:
Jan 22 04:00:23 np0005591762 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 22 04:00:23 np0005591762 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 22 04:00:23 np0005591762 kernel:  Normal   [mem 0x0000000100000000-0x000000027fffffff]
Jan 22 04:00:23 np0005591762 kernel:  Device   empty
Jan 22 04:00:23 np0005591762 kernel: Movable zone start for each node
Jan 22 04:00:23 np0005591762 kernel: Early memory node ranges
Jan 22 04:00:23 np0005591762 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 22 04:00:23 np0005591762 kernel:  node   0: [mem 0x0000000000100000-0x000000007ffdafff]
Jan 22 04:00:23 np0005591762 kernel:  node   0: [mem 0x0000000100000000-0x000000027fffffff]
Jan 22 04:00:23 np0005591762 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000027fffffff]
Jan 22 04:00:23 np0005591762 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 22 04:00:23 np0005591762 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 22 04:00:23 np0005591762 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 22 04:00:23 np0005591762 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 22 04:00:23 np0005591762 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 22 04:00:23 np0005591762 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 22 04:00:23 np0005591762 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 22 04:00:23 np0005591762 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 22 04:00:23 np0005591762 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 22 04:00:23 np0005591762 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 22 04:00:23 np0005591762 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 22 04:00:23 np0005591762 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 22 04:00:23 np0005591762 kernel: TSC deadline timer available
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Max. logical packages:   4
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Max. logical dies:       4
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Max. dies per package:   1
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Max. threads per core:   1
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Num. cores per package:     1
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Num. threads per package:   1
Jan 22 04:00:23 np0005591762 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: setup PV sched yield
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0x7ffdb000-0x7fffffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0x80000000-0xafffffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xb0000000-0xbfffffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfed1bfff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xfed1c000-0xfed1ffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xfed20000-0xfeffbfff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 22 04:00:23 np0005591762 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 22 04:00:23 np0005591762 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 22 04:00:23 np0005591762 kernel: Booting paravirtualized kernel on KVM
Jan 22 04:00:23 np0005591762 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 22 04:00:23 np0005591762 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Jan 22 04:00:23 np0005591762 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u524288
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: PV spinlocks enabled
Jan 22 04:00:23 np0005591762 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 04:00:23 np0005591762 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 22 04:00:23 np0005591762 kernel: random: crng init done
Jan 22 04:00:23 np0005591762 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: Fallback order for Node 0: 0 
Jan 22 04:00:23 np0005591762 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 22 04:00:23 np0005591762 kernel: Policy zone: Normal
Jan 22 04:00:23 np0005591762 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 22 04:00:23 np0005591762 kernel: software IO TLB: area num 4.
Jan 22 04:00:23 np0005591762 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 22 04:00:23 np0005591762 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 22 04:00:23 np0005591762 kernel: ftrace: allocated 194 pages with 3 groups
Jan 22 04:00:23 np0005591762 kernel: Dynamic Preempt: voluntary
Jan 22 04:00:23 np0005591762 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 22 04:00:23 np0005591762 kernel: rcu: 	RCU event tracing is enabled.
Jan 22 04:00:23 np0005591762 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=4.
Jan 22 04:00:23 np0005591762 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 22 04:00:23 np0005591762 kernel: 	Rude variant of Tasks RCU enabled.
Jan 22 04:00:23 np0005591762 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 22 04:00:23 np0005591762 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 22 04:00:23 np0005591762 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 22 04:00:23 np0005591762 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 22 04:00:23 np0005591762 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 22 04:00:23 np0005591762 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 22 04:00:23 np0005591762 kernel: NR_IRQS: 524544, nr_irqs: 456, preallocated irqs: 16
Jan 22 04:00:23 np0005591762 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 22 04:00:23 np0005591762 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 22 04:00:23 np0005591762 kernel: Console: colour VGA+ 80x25
Jan 22 04:00:23 np0005591762 kernel: printk: console [ttyS0] enabled
Jan 22 04:00:23 np0005591762 kernel: ACPI: Core revision 20230331
Jan 22 04:00:23 np0005591762 kernel: APIC: Switch to symmetric I/O mode setup
Jan 22 04:00:23 np0005591762 kernel: x2apic enabled
Jan 22 04:00:23 np0005591762 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 22 04:00:23 np0005591762 kernel: kvm-guest: setup PV IPIs
Jan 22 04:00:23 np0005591762 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 22 04:00:23 np0005591762 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Jan 22 04:00:23 np0005591762 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 22 04:00:23 np0005591762 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 22 04:00:23 np0005591762 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 22 04:00:23 np0005591762 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 22 04:00:23 np0005591762 kernel: Spectre V2 : Mitigation: Retpolines
Jan 22 04:00:23 np0005591762 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 22 04:00:23 np0005591762 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 22 04:00:23 np0005591762 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 22 04:00:23 np0005591762 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 22 04:00:23 np0005591762 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 22 04:00:23 np0005591762 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 22 04:00:23 np0005591762 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 22 04:00:23 np0005591762 kernel: Transient Scheduler Attacks: Vulnerable: No microcode
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: xstate_offset[9]:  832, xstate_sizes[9]:    8
Jan 22 04:00:23 np0005591762 kernel: x86/fpu: Enabled xstate features 0x207, context size is 840 bytes, using 'compacted' format.
Jan 22 04:00:23 np0005591762 kernel: Freeing SMP alternatives memory: 40K
Jan 22 04:00:23 np0005591762 kernel: pid_max: default: 32768 minimum: 301
Jan 22 04:00:23 np0005591762 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 22 04:00:23 np0005591762 kernel: landlock: Up and running.
Jan 22 04:00:23 np0005591762 kernel: Yama: becoming mindful.
Jan 22 04:00:23 np0005591762 kernel: SELinux:  Initializing.
Jan 22 04:00:23 np0005591762 kernel: LSM support for eBPF active
Jan 22 04:00:23 np0005591762 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Jan 22 04:00:23 np0005591762 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 22 04:00:23 np0005591762 kernel: ... version:                0
Jan 22 04:00:23 np0005591762 kernel: ... bit width:              48
Jan 22 04:00:23 np0005591762 kernel: ... generic registers:      6
Jan 22 04:00:23 np0005591762 kernel: ... value mask:             0000ffffffffffff
Jan 22 04:00:23 np0005591762 kernel: ... max period:             00007fffffffffff
Jan 22 04:00:23 np0005591762 kernel: ... fixed-purpose events:   0
Jan 22 04:00:23 np0005591762 kernel: ... event mask:             000000000000003f
Jan 22 04:00:23 np0005591762 kernel: signal: max sigframe size: 3376
Jan 22 04:00:23 np0005591762 kernel: rcu: Hierarchical SRCU implementation.
Jan 22 04:00:23 np0005591762 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 22 04:00:23 np0005591762 kernel: smp: Bringing up secondary CPUs ...
Jan 22 04:00:23 np0005591762 kernel: smpboot: x86: Booting SMP configuration:
Jan 22 04:00:23 np0005591762 kernel: .... node  #0, CPUs:      #1 #2 #3
Jan 22 04:00:23 np0005591762 kernel: smp: Brought up 1 node, 4 CPUs
Jan 22 04:00:23 np0005591762 kernel: smpboot: Total of 4 processors activated (19563.23 BogoMIPS)
Jan 22 04:00:23 np0005591762 kernel: node 0 deferred pages initialised in 8ms
Jan 22 04:00:23 np0005591762 kernel: Memory: 7766200K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 617308K reserved, 0K cma-reserved)
Jan 22 04:00:23 np0005591762 kernel: devtmpfs: initialized
Jan 22 04:00:23 np0005591762 kernel: x86/mm: Memory block size: 128MB
Jan 22 04:00:23 np0005591762 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 22 04:00:23 np0005591762 kernel: futex hash table entries: 1024 (65536 bytes on 1 NUMA nodes, total 64 KiB, linear).
Jan 22 04:00:23 np0005591762 kernel: pinctrl core: initialized pinctrl subsystem
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 22 04:00:23 np0005591762 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 22 04:00:23 np0005591762 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 22 04:00:23 np0005591762 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 22 04:00:23 np0005591762 kernel: audit: initializing netlink subsys (disabled)
Jan 22 04:00:23 np0005591762 kernel: audit: type=2000 audit(1769072422.692:1): state=initialized audit_enabled=0 res=1
Jan 22 04:00:23 np0005591762 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 22 04:00:23 np0005591762 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 22 04:00:23 np0005591762 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 22 04:00:23 np0005591762 kernel: cpuidle: using governor menu
Jan 22 04:00:23 np0005591762 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 22 04:00:23 np0005591762 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Jan 22 04:00:23 np0005591762 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 22 04:00:23 np0005591762 kernel: PCI: Using configuration type 1 for base access
Jan 22 04:00:23 np0005591762 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 22 04:00:23 np0005591762 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 22 04:00:23 np0005591762 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 22 04:00:23 np0005591762 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 22 04:00:23 np0005591762 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 22 04:00:23 np0005591762 kernel: Demotion targets for Node 0: null
Jan 22 04:00:23 np0005591762 kernel: cryptd: max_cpu_qlen set to 1000
Jan 22 04:00:23 np0005591762 kernel: ACPI: Added _OSI(Module Device)
Jan 22 04:00:23 np0005591762 kernel: ACPI: Added _OSI(Processor Device)
Jan 22 04:00:23 np0005591762 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 22 04:00:23 np0005591762 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 22 04:00:23 np0005591762 kernel: ACPI: Interpreter enabled
Jan 22 04:00:23 np0005591762 kernel: ACPI: PM: (supports S0 S5)
Jan 22 04:00:23 np0005591762 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 22 04:00:23 np0005591762 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 22 04:00:23 np0005591762 kernel: PCI: Using E820 reservations for host bridge windows
Jan 22 04:00:23 np0005591762 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 22 04:00:23 np0005591762 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 22 04:00:23 np0005591762 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR DPC]
Jan 22 04:00:23 np0005591762 kernel: acpi PNP0A08:00: _OSC: OS now controls [SHPCHotplug PME AER PCIeCapability]
Jan 22 04:00:23 np0005591762 kernel: PCI host bridge to bus 0000:00
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [mem 0x280000000-0xa7fffffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: BAR 0 [mem 0xf9800000-0xf9ffffff pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfc200000-0xfc203fff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: BAR 0 [mem 0xfea1a000-0xfea1afff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: BAR 0 [mem 0xfea1b000-0xfea1bfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: BAR 0 [mem 0xfea1c000-0xfea1cfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: BAR 0 [mem 0xfea1d000-0xfea1dfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: BAR 0 [mem 0xfea1e000-0xfea1efff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: BAR 0 [mem 0xfea1f000-0xfea1ffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: BAR 0 [mem 0xfea20000-0xfea20fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: BAR 0 [mem 0xfea21000-0xfea21fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.0: quirk: [io  0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.2: BAR 4 [io  0xd040-0xd05f]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea22000-0xfea22fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:1f.3: BAR 4 [io  0x0700-0x073f]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfc800000-0xfc8000ff 64bit]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:02: extended config space not accessible
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [1] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [2] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [3] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [4] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [5] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [6] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [7] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [8] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [9] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [10] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [11] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [12] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [13] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [14] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [15] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [16] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [17] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [18] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [19] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [20] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [21] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [22] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [23] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [24] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [25] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [26] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [27] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [28] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [29] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [30] registered
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [31] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:02:01.0: BAR 4 [io  0xc000-0xc01f]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-2] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe840000-0xfe840fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfbe00000-0xfbe03fff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:03:00.0: ROM [mem 0xfe800000-0xfe83ffff pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-3] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:04:00.0: BAR 1 [mem 0xfe600000-0xfe600fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfbc00000-0xfbc03fff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-4] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfba00000-0xfba03fff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-5] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 22 04:00:23 np0005591762 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfb800000-0xfb803fff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-6] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-7] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-8] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-9] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-10] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-11] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-12] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-13] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-14] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-15] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-16] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 22 04:00:23 np0005591762 kernel: acpiphp: Slot [0-17] registered
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 22 04:00:23 np0005591762 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 22 04:00:23 np0005591762 kernel: iommu: Default domain type: Translated
Jan 22 04:00:23 np0005591762 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 22 04:00:23 np0005591762 kernel: SCSI subsystem initialized
Jan 22 04:00:23 np0005591762 kernel: ACPI: bus type USB registered
Jan 22 04:00:23 np0005591762 kernel: usbcore: registered new interface driver usbfs
Jan 22 04:00:23 np0005591762 kernel: usbcore: registered new interface driver hub
Jan 22 04:00:23 np0005591762 kernel: usbcore: registered new device driver usb
Jan 22 04:00:23 np0005591762 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 22 04:00:23 np0005591762 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 22 04:00:23 np0005591762 kernel: PTP clock support registered
Jan 22 04:00:23 np0005591762 kernel: EDAC MC: Ver: 3.0.0
Jan 22 04:00:23 np0005591762 kernel: NetLabel: Initializing
Jan 22 04:00:23 np0005591762 kernel: NetLabel:  domain hash size = 128
Jan 22 04:00:23 np0005591762 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 22 04:00:23 np0005591762 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 22 04:00:23 np0005591762 kernel: PCI: Using ACPI for IRQ routing
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 22 04:00:23 np0005591762 kernel: vgaarb: loaded
Jan 22 04:00:23 np0005591762 kernel: clocksource: Switched to clocksource kvm-clock
Jan 22 04:00:23 np0005591762 kernel: VFS: Disk quotas dquot_6.6.0
Jan 22 04:00:23 np0005591762 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 22 04:00:23 np0005591762 kernel: pnp: PnP ACPI init
Jan 22 04:00:23 np0005591762 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 22 04:00:23 np0005591762 kernel: pnp: PnP ACPI: found 5 devices
Jan 22 04:00:23 np0005591762 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_INET protocol family
Jan 22 04:00:23 np0005591762 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 22 04:00:23 np0005591762 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_XDP protocol family
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x0fff] to [bus 03] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: bridge window [io  0x1000-0x0fff] to [bus 04] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: bridge window [io  0x1000-0x0fff] to [bus 05] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: bridge window [io  0x1000-0x0fff] to [bus 06] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: bridge window [io  0x1000-0x0fff] to [bus 07] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: bridge window [io  0x1000-0x0fff] to [bus 08] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: bridge window [io  0x1000-0x0fff] to [bus 09] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: bridge window [io  0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: bridge window [io  0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: bridge window [io  0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: bridge window [io  0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: bridge window [io  0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: bridge window [io  0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: bridge window [io  0x1000-0x0fff] to [bus 10] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: bridge window [io  0x1000-0x0fff] to [bus 11] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x0fff] to [bus 12] add_size 1000
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x1fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: bridge window [io  0x2000-0x2fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: bridge window [io  0x3000-0x3fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: bridge window [io  0x4000-0x4fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: bridge window [io  0x5000-0x5fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: bridge window [io  0x6000-0x6fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: bridge window [io  0x7000-0x7fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: bridge window [io  0x8000-0x8fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: bridge window [io  0x9000-0x9fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: bridge window [io  0xa000-0xafff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: bridge window [io  0xb000-0xbfff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: bridge window [io  0xe000-0xefff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: bridge window [io  0xf000-0xffff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: can't assign; no space
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: failed to assign
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: can't assign; no space
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: failed to assign
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: can't assign; no space
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: failed to assign
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x1fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: bridge window [io  0x2000-0x2fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: bridge window [io  0x3000-0x3fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: bridge window [io  0x4000-0x4fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: bridge window [io  0x5000-0x5fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: bridge window [io  0x6000-0x6fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: bridge window [io  0x7000-0x7fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: bridge window [io  0x8000-0x8fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: bridge window [io  0x9000-0x9fff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: bridge window [io  0xa000-0xafff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: bridge window [io  0xb000-0xbfff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: bridge window [io  0xe000-0xefff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: bridge window [io  0xf000-0xffff]: assigned
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: can't assign; no space
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: failed to assign
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: can't assign; no space
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: failed to assign
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: can't assign; no space
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: failed to assign
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4:   bridge window [io  0xf000-0xffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5:   bridge window [io  0xe000-0xefff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6:   bridge window [io  0xb000-0xbfff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7:   bridge window [io  0xa000-0xafff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0:   bridge window [io  0x9000-0x9fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1:   bridge window [io  0x8000-0x8fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2:   bridge window [io  0x7000-0x7fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3:   bridge window [io  0x6000-0x6fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4:   bridge window [io  0x5000-0x5fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5:   bridge window [io  0x4000-0x4fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6:   bridge window [io  0x3000-0x3fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7:   bridge window [io  0x2000-0x2fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0:   bridge window [io  0x1000-0x1fff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Jan 22 04:00:23 np0005591762 kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:00: resource 9 [mem 0x280000000-0xa7fffffff window]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:01: resource 0 [io  0xc000-0xcfff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:01: resource 1 [mem 0xfc600000-0xfc9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:01: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:02: resource 0 [io  0xc000-0xcfff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:02: resource 1 [mem 0xfc600000-0xfc7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:02: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:03: resource 2 [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:04: resource 2 [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:05: resource 2 [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:06: resource 0 [io  0xf000-0xffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:06: resource 2 [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:07: resource 0 [io  0xe000-0xefff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:07: resource 2 [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:08: resource 0 [io  0xb000-0xbfff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:08: resource 2 [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:09: resource 0 [io  0xa000-0xafff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:09: resource 2 [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0a: resource 0 [io  0x9000-0x9fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0a: resource 1 [mem 0xfda00000-0xfdbfffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0a: resource 2 [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0b: resource 0 [io  0x8000-0x8fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0b: resource 2 [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0c: resource 0 [io  0x7000-0x7fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd600000-0xfd7fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0c: resource 2 [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0d: resource 0 [io  0x6000-0x6fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0d: resource 1 [mem 0xfd400000-0xfd5fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0d: resource 2 [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0e: resource 0 [io  0x5000-0x5fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0e: resource 1 [mem 0xfd200000-0xfd3fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0e: resource 2 [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0f: resource 0 [io  0x4000-0x4fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0f: resource 1 [mem 0xfd000000-0xfd1fffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:0f: resource 2 [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:10: resource 0 [io  0x3000-0x3fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:10: resource 1 [mem 0xfce00000-0xfcffffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:10: resource 2 [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:11: resource 0 [io  0x2000-0x2fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:11: resource 1 [mem 0xfcc00000-0xfcdfffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:11: resource 2 [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:12: resource 0 [io  0x1000-0x1fff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:12: resource 1 [mem 0xfca00000-0xfcbfffff]
Jan 22 04:00:23 np0005591762 kernel: pci_bus 0000:12: resource 2 [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 22 04:00:23 np0005591762 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 22 04:00:23 np0005591762 kernel: PCI: CLS 0 bytes, default 64
Jan 22 04:00:23 np0005591762 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 22 04:00:23 np0005591762 kernel: software IO TLB: mapped [mem 0x000000006b000000-0x000000006f000000] (64MB)
Jan 22 04:00:23 np0005591762 kernel: Trying to unpack rootfs image as initramfs...
Jan 22 04:00:23 np0005591762 kernel: ACPI: bus type thunderbolt registered
Jan 22 04:00:23 np0005591762 kernel: Initialise system trusted keyrings
Jan 22 04:00:23 np0005591762 kernel: Key type blacklist registered
Jan 22 04:00:23 np0005591762 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 22 04:00:23 np0005591762 kernel: zbud: loaded
Jan 22 04:00:23 np0005591762 kernel: integrity: Platform Keyring initialized
Jan 22 04:00:23 np0005591762 kernel: integrity: Machine keyring initialized
Jan 22 04:00:23 np0005591762 kernel: Freeing initrd memory: 87956K
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_ALG protocol family
Jan 22 04:00:23 np0005591762 kernel: xor: automatically using best checksumming function   avx       
Jan 22 04:00:23 np0005591762 kernel: Key type asymmetric registered
Jan 22 04:00:23 np0005591762 kernel: Asymmetric key parser 'x509' registered
Jan 22 04:00:23 np0005591762 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 22 04:00:23 np0005591762 kernel: io scheduler mq-deadline registered
Jan 22 04:00:23 np0005591762 kernel: io scheduler kyber registered
Jan 22 04:00:23 np0005591762 kernel: io scheduler bfq registered
Jan 22 04:00:23 np0005591762 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 22 04:00:23 np0005591762 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39
Jan 22 04:00:23 np0005591762 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40
Jan 22 04:00:23 np0005591762 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40
Jan 22 04:00:23 np0005591762 kernel: shpchp 0000:01:00.0: HPC vendor_id 1b36 device_id e ss_vid 0 ss_did 0
Jan 22 04:00:23 np0005591762 kernel: shpchp 0000:01:00.0: pci_hp_register failed with error -16
Jan 22 04:00:23 np0005591762 kernel: shpchp 0000:01:00.0: Slot initialization failed
Jan 22 04:00:23 np0005591762 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 22 04:00:23 np0005591762 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 22 04:00:23 np0005591762 kernel: ACPI: button: Power Button [PWRF]
Jan 22 04:00:23 np0005591762 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Jan 22 04:00:23 np0005591762 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 22 04:00:23 np0005591762 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 22 04:00:23 np0005591762 kernel: Non-volatile memory driver v1.3
Jan 22 04:00:23 np0005591762 kernel: rdac: device handler registered
Jan 22 04:00:23 np0005591762 kernel: hp_sw: device handler registered
Jan 22 04:00:23 np0005591762 kernel: emc: device handler registered
Jan 22 04:00:23 np0005591762 kernel: alua: device handler registered
Jan 22 04:00:23 np0005591762 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller
Jan 22 04:00:23 np0005591762 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1
Jan 22 04:00:23 np0005591762 kernel: uhci_hcd 0000:02:01.0: detected 2 ports
Jan 22 04:00:23 np0005591762 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x0000c000
Jan 22 04:00:23 np0005591762 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 22 04:00:23 np0005591762 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 22 04:00:23 np0005591762 kernel: usb usb1: Product: UHCI Host Controller
Jan 22 04:00:23 np0005591762 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 22 04:00:23 np0005591762 kernel: usb usb1: SerialNumber: 0000:02:01.0
Jan 22 04:00:23 np0005591762 kernel: hub 1-0:1.0: USB hub found
Jan 22 04:00:23 np0005591762 kernel: hub 1-0:1.0: 2 ports detected
Jan 22 04:00:23 np0005591762 kernel: usbcore: registered new interface driver usbserial_generic
Jan 22 04:00:23 np0005591762 kernel: usbserial: USB Serial support registered for generic
Jan 22 04:00:23 np0005591762 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 22 04:00:23 np0005591762 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 22 04:00:23 np0005591762 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 22 04:00:23 np0005591762 kernel: mousedev: PS/2 mouse device common for all mice
Jan 22 04:00:23 np0005591762 kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 22 04:00:23 np0005591762 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 22 04:00:23 np0005591762 kernel: rtc_cmos 00:03: registered as rtc0
Jan 22 04:00:23 np0005591762 kernel: rtc_cmos 00:03: setting system clock to 2026-01-22T09:00:23 UTC (1769072423)
Jan 22 04:00:23 np0005591762 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 22 04:00:23 np0005591762 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 22 04:00:23 np0005591762 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 22 04:00:23 np0005591762 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 22 04:00:23 np0005591762 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 22 04:00:23 np0005591762 kernel: usbcore: registered new interface driver usbhid
Jan 22 04:00:23 np0005591762 kernel: usbhid: USB HID core driver
Jan 22 04:00:23 np0005591762 kernel: drop_monitor: Initializing network drop monitor service
Jan 22 04:00:23 np0005591762 kernel: Initializing XFRM netlink socket
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_INET6 protocol family
Jan 22 04:00:23 np0005591762 kernel: Segment Routing with IPv6
Jan 22 04:00:23 np0005591762 kernel: NET: Registered PF_PACKET protocol family
Jan 22 04:00:23 np0005591762 kernel: mpls_gso: MPLS GSO support
Jan 22 04:00:23 np0005591762 kernel: IPI shorthand broadcast: enabled
Jan 22 04:00:23 np0005591762 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 22 04:00:23 np0005591762 kernel: AES CTR mode by8 optimization enabled
Jan 22 04:00:23 np0005591762 kernel: sched_clock: Marking stable (1129002011, 145289131)->(1343339136, -69047994)
Jan 22 04:00:23 np0005591762 kernel: registered taskstats version 1
Jan 22 04:00:23 np0005591762 kernel: Loading compiled-in X.509 certificates
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 22 04:00:23 np0005591762 kernel: Demotion targets for Node 0: null
Jan 22 04:00:23 np0005591762 kernel: page_owner is disabled
Jan 22 04:00:23 np0005591762 kernel: Key type .fscrypt registered
Jan 22 04:00:23 np0005591762 kernel: Key type fscrypt-provisioning registered
Jan 22 04:00:23 np0005591762 kernel: Key type big_key registered
Jan 22 04:00:23 np0005591762 kernel: Key type encrypted registered
Jan 22 04:00:23 np0005591762 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 22 04:00:23 np0005591762 kernel: Loading compiled-in module X.509 certificates
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 04:00:23 np0005591762 kernel: ima: Allocated hash algorithm: sha256
Jan 22 04:00:23 np0005591762 kernel: ima: No architecture policies found
Jan 22 04:00:23 np0005591762 kernel: evm: Initialising EVM extended attributes:
Jan 22 04:00:23 np0005591762 kernel: evm: security.selinux
Jan 22 04:00:23 np0005591762 kernel: evm: security.SMACK64 (disabled)
Jan 22 04:00:23 np0005591762 kernel: evm: security.SMACK64EXEC (disabled)
Jan 22 04:00:23 np0005591762 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 22 04:00:23 np0005591762 kernel: evm: security.SMACK64MMAP (disabled)
Jan 22 04:00:23 np0005591762 kernel: evm: security.apparmor (disabled)
Jan 22 04:00:23 np0005591762 kernel: evm: security.ima
Jan 22 04:00:23 np0005591762 kernel: evm: security.capability
Jan 22 04:00:23 np0005591762 kernel: evm: HMAC attrs: 0x1
Jan 22 04:00:23 np0005591762 kernel: Running certificate verification RSA selftest
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 22 04:00:23 np0005591762 kernel: Running certificate verification ECDSA selftest
Jan 22 04:00:23 np0005591762 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 22 04:00:23 np0005591762 kernel: clk: Disabling unused clocks
Jan 22 04:00:23 np0005591762 kernel: Freeing unused decrypted memory: 2028K
Jan 22 04:00:23 np0005591762 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 22 04:00:23 np0005591762 kernel: Write protecting the kernel read-only data: 30720k
Jan 22 04:00:23 np0005591762 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 22 04:00:23 np0005591762 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 22 04:00:23 np0005591762 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 22 04:00:23 np0005591762 kernel: Run /init as init process
Jan 22 04:00:23 np0005591762 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 04:00:23 np0005591762 systemd: Detected virtualization kvm.
Jan 22 04:00:23 np0005591762 systemd: Detected architecture x86-64.
Jan 22 04:00:23 np0005591762 systemd: Running in initrd.
Jan 22 04:00:23 np0005591762 systemd: No hostname configured, using default hostname.
Jan 22 04:00:23 np0005591762 systemd: Hostname set to <localhost>.
Jan 22 04:00:23 np0005591762 systemd: Initializing machine ID from VM UUID.
Jan 22 04:00:23 np0005591762 systemd: Queued start job for default target Initrd Default Target.
Jan 22 04:00:23 np0005591762 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 04:00:23 np0005591762 systemd: Reached target Local Encrypted Volumes.
Jan 22 04:00:23 np0005591762 systemd: Reached target Initrd /usr File System.
Jan 22 04:00:23 np0005591762 systemd: Reached target Local File Systems.
Jan 22 04:00:23 np0005591762 systemd: Reached target Path Units.
Jan 22 04:00:23 np0005591762 systemd: Reached target Slice Units.
Jan 22 04:00:23 np0005591762 systemd: Reached target Swaps.
Jan 22 04:00:23 np0005591762 systemd: Reached target Timer Units.
Jan 22 04:00:23 np0005591762 systemd: Listening on D-Bus System Message Bus Socket.
Jan 22 04:00:23 np0005591762 systemd: Listening on Journal Socket (/dev/log).
Jan 22 04:00:23 np0005591762 systemd: Listening on Journal Socket.
Jan 22 04:00:23 np0005591762 systemd: Listening on udev Control Socket.
Jan 22 04:00:23 np0005591762 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 22 04:00:23 np0005591762 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 22 04:00:23 np0005591762 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 22 04:00:23 np0005591762 kernel: usb 1-1: Manufacturer: QEMU
Jan 22 04:00:23 np0005591762 systemd: Listening on udev Kernel Socket.
Jan 22 04:00:23 np0005591762 kernel: usb 1-1: SerialNumber: 28754-0000:00:02.0:00.0:01.0-1
Jan 22 04:00:23 np0005591762 systemd: Reached target Socket Units.
Jan 22 04:00:23 np0005591762 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 22 04:00:23 np0005591762 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0
Jan 22 04:00:23 np0005591762 systemd: Starting Create List of Static Device Nodes...
Jan 22 04:00:23 np0005591762 systemd: Starting Journal Service...
Jan 22 04:00:23 np0005591762 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 04:00:23 np0005591762 systemd: Starting Apply Kernel Variables...
Jan 22 04:00:23 np0005591762 systemd: Starting Create System Users...
Jan 22 04:00:23 np0005591762 systemd: Starting Setup Virtual Console...
Jan 22 04:00:23 np0005591762 systemd: Finished Create List of Static Device Nodes.
Jan 22 04:00:23 np0005591762 systemd: Finished Apply Kernel Variables.
Jan 22 04:00:23 np0005591762 systemd: Finished Create System Users.
Jan 22 04:00:23 np0005591762 systemd: Starting Create Static Device Nodes in /dev...
Jan 22 04:00:23 np0005591762 systemd-journald[282]: Journal started
Jan 22 04:00:23 np0005591762 systemd-journald[282]: Runtime Journal (/run/log/journal/5cdfdaefd5ed40c6865aabf2be70f95e) is 8.0M, max 153.6M, 145.6M free.
Jan 22 04:00:23 np0005591762 systemd-sysusers[286]: Creating group 'users' with GID 100.
Jan 22 04:00:23 np0005591762 systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Jan 22 04:00:23 np0005591762 systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 22 04:00:23 np0005591762 systemd: Started Journal Service.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 04:00:24 np0005591762 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 04:00:24 np0005591762 systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 04:00:24 np0005591762 systemd[1]: Finished Setup Virtual Console.
Jan 22 04:00:24 np0005591762 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting dracut cmdline hook...
Jan 22 04:00:24 np0005591762 dracut-cmdline[302]: dracut-9 dracut-057-102.git20250818.el9
Jan 22 04:00:24 np0005591762 dracut-cmdline[302]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 04:00:24 np0005591762 systemd[1]: Finished dracut cmdline hook.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting dracut pre-udev hook...
Jan 22 04:00:24 np0005591762 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 22 04:00:24 np0005591762 kernel: device-mapper: uevent: version 1.0.3
Jan 22 04:00:24 np0005591762 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 22 04:00:24 np0005591762 kernel: RPC: Registered named UNIX socket transport module.
Jan 22 04:00:24 np0005591762 kernel: RPC: Registered udp transport module.
Jan 22 04:00:24 np0005591762 kernel: RPC: Registered tcp transport module.
Jan 22 04:00:24 np0005591762 kernel: RPC: Registered tcp-with-tls transport module.
Jan 22 04:00:24 np0005591762 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 22 04:00:24 np0005591762 rpc.statd[416]: Version 2.5.4 starting
Jan 22 04:00:24 np0005591762 rpc.statd[416]: Initializing NSM state
Jan 22 04:00:24 np0005591762 rpc.idmapd[421]: Setting log level to 0
Jan 22 04:00:24 np0005591762 systemd[1]: Finished dracut pre-udev hook.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 04:00:24 np0005591762 systemd-udevd[434]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 04:00:24 np0005591762 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting dracut pre-trigger hook...
Jan 22 04:00:24 np0005591762 systemd[1]: Finished dracut pre-trigger hook.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting Coldplug All udev Devices...
Jan 22 04:00:24 np0005591762 systemd[1]: Created slice Slice /system/modprobe.
Jan 22 04:00:24 np0005591762 systemd[1]: Starting Load Kernel Module configfs...
Jan 22 04:00:24 np0005591762 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 04:00:24 np0005591762 systemd[1]: Finished Load Kernel Module configfs.
Jan 22 04:00:24 np0005591762 systemd[1]: Finished Coldplug All udev Devices.
Jan 22 04:00:24 np0005591762 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 04:00:24 np0005591762 systemd[1]: Reached target Network.
Jan 22 04:00:24 np0005591762 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 04:00:24 np0005591762 systemd[1]: Starting dracut initqueue hook...
Jan 22 04:00:24 np0005591762 kernel: virtio_blk virtio2: 4/0/0 default/read/poll queues
Jan 22 04:00:24 np0005591762 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 22 04:00:24 np0005591762 kernel: vda: vda1
Jan 22 04:00:24 np0005591762 systemd-udevd[436]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:00:24 np0005591762 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 22 04:00:24 np0005591762 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Jan 22 04:00:24 np0005591762 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Jan 22 04:00:24 np0005591762 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only 
Jan 22 04:00:24 np0005591762 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 04:00:24 np0005591762 systemd[1]: Reached target Initrd Root Device.
Jan 22 04:00:24 np0005591762 kernel: scsi host0: ahci
Jan 22 04:00:24 np0005591762 kernel: scsi host1: ahci
Jan 22 04:00:24 np0005591762 kernel: scsi host2: ahci
Jan 22 04:00:24 np0005591762 kernel: scsi host3: ahci
Jan 22 04:00:24 np0005591762 kernel: scsi host4: ahci
Jan 22 04:00:24 np0005591762 kernel: scsi host5: ahci
Jan 22 04:00:24 np0005591762 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22100 irq 49 lpm-pol 0
Jan 22 04:00:24 np0005591762 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22180 irq 49 lpm-pol 0
Jan 22 04:00:24 np0005591762 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22200 irq 49 lpm-pol 0
Jan 22 04:00:24 np0005591762 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22280 irq 49 lpm-pol 0
Jan 22 04:00:24 np0005591762 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22300 irq 49 lpm-pol 0
Jan 22 04:00:24 np0005591762 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22380 irq 49 lpm-pol 0
Jan 22 04:00:24 np0005591762 systemd[1]: Mounting Kernel Configuration File System...
Jan 22 04:00:24 np0005591762 systemd[1]: Mounted Kernel Configuration File System.
Jan 22 04:00:24 np0005591762 systemd[1]: Reached target System Initialization.
Jan 22 04:00:24 np0005591762 systemd[1]: Reached target Basic System.
Jan 22 04:00:24 np0005591762 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Jan 22 04:00:24 np0005591762 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 22 04:00:24 np0005591762 kernel: ata1.00: applying bridge limits
Jan 22 04:00:24 np0005591762 kernel: ata1.00: configured for UDMA/100
Jan 22 04:00:24 np0005591762 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 22 04:00:24 np0005591762 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Jan 22 04:00:24 np0005591762 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Jan 22 04:00:24 np0005591762 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Jan 22 04:00:24 np0005591762 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Jan 22 04:00:24 np0005591762 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Jan 22 04:00:24 np0005591762 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 22 04:00:24 np0005591762 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 22 04:00:24 np0005591762 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 22 04:00:25 np0005591762 systemd[1]: Finished dracut initqueue hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Remote File Systems.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting dracut pre-mount hook...
Jan 22 04:00:25 np0005591762 systemd[1]: Finished dracut pre-mount hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 22 04:00:25 np0005591762 systemd-fsck[526]: /usr/sbin/fsck.xfs: XFS file system.
Jan 22 04:00:25 np0005591762 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 04:00:25 np0005591762 systemd[1]: Mounting /sysroot...
Jan 22 04:00:25 np0005591762 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 22 04:00:25 np0005591762 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 22 04:00:25 np0005591762 kernel: XFS (vda1): Ending clean mount
Jan 22 04:00:25 np0005591762 systemd[1]: Mounted /sysroot.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Initrd Root File System.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 22 04:00:25 np0005591762 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Initrd File Systems.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Initrd Default Target.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting dracut mount hook...
Jan 22 04:00:25 np0005591762 systemd[1]: Finished dracut mount hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 22 04:00:25 np0005591762 rpc.idmapd[421]: exiting on signal 15
Jan 22 04:00:25 np0005591762 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Network.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Timer Units.
Jan 22 04:00:25 np0005591762 systemd[1]: dbus.socket: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Initrd Default Target.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Basic System.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Initrd Root Device.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Initrd /usr File System.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Path Units.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Remote File Systems.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Slice Units.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Socket Units.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target System Initialization.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Local File Systems.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Swaps.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut mount hook.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut pre-mount hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut initqueue hook.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Apply Kernel Variables.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Coldplug All udev Devices.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut pre-trigger hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Setup Virtual Console.
Jan 22 04:00:25 np0005591762 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Closed udev Control Socket.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Closed udev Kernel Socket.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut pre-udev hook.
Jan 22 04:00:25 np0005591762 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped dracut cmdline hook.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting Cleanup udev Database...
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 22 04:00:25 np0005591762 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 22 04:00:25 np0005591762 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Stopped Create System Users.
Jan 22 04:00:25 np0005591762 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 22 04:00:25 np0005591762 systemd[1]: Finished Cleanup udev Database.
Jan 22 04:00:25 np0005591762 systemd[1]: Reached target Switch Root.
Jan 22 04:00:25 np0005591762 systemd[1]: Starting Switch Root...
Jan 22 04:00:25 np0005591762 systemd[1]: Switching root.
Jan 22 04:00:25 np0005591762 systemd-journald[282]: Journal stopped
Jan 22 04:00:26 np0005591762 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 22 04:00:26 np0005591762 kernel: audit: type=1404 audit(1769072425.842:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:00:26 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:00:26 np0005591762 kernel: audit: type=1403 audit(1769072425.962:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 22 04:00:26 np0005591762 systemd: Successfully loaded SELinux policy in 122.115ms.
Jan 22 04:00:26 np0005591762 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.427ms.
Jan 22 04:00:26 np0005591762 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 04:00:26 np0005591762 systemd: Detected virtualization kvm.
Jan 22 04:00:26 np0005591762 systemd: Detected architecture x86-64.
Jan 22 04:00:26 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:00:26 np0005591762 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd: Stopped Switch Root.
Jan 22 04:00:26 np0005591762 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 22 04:00:26 np0005591762 systemd: Created slice Slice /system/getty.
Jan 22 04:00:26 np0005591762 systemd: Created slice Slice /system/serial-getty.
Jan 22 04:00:26 np0005591762 systemd: Created slice Slice /system/sshd-keygen.
Jan 22 04:00:26 np0005591762 systemd: Created slice User and Session Slice.
Jan 22 04:00:26 np0005591762 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 04:00:26 np0005591762 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 22 04:00:26 np0005591762 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 22 04:00:26 np0005591762 systemd: Reached target Local Encrypted Volumes.
Jan 22 04:00:26 np0005591762 systemd: Stopped target Switch Root.
Jan 22 04:00:26 np0005591762 systemd: Stopped target Initrd File Systems.
Jan 22 04:00:26 np0005591762 systemd: Stopped target Initrd Root File System.
Jan 22 04:00:26 np0005591762 systemd: Reached target Local Integrity Protected Volumes.
Jan 22 04:00:26 np0005591762 systemd: Reached target Path Units.
Jan 22 04:00:26 np0005591762 systemd: Reached target rpc_pipefs.target.
Jan 22 04:00:26 np0005591762 systemd: Reached target Slice Units.
Jan 22 04:00:26 np0005591762 systemd: Reached target Swaps.
Jan 22 04:00:26 np0005591762 systemd: Reached target Local Verity Protected Volumes.
Jan 22 04:00:26 np0005591762 systemd: Listening on RPCbind Server Activation Socket.
Jan 22 04:00:26 np0005591762 systemd: Reached target RPC Port Mapper.
Jan 22 04:00:26 np0005591762 systemd: Listening on Process Core Dump Socket.
Jan 22 04:00:26 np0005591762 systemd: Listening on initctl Compatibility Named Pipe.
Jan 22 04:00:26 np0005591762 systemd: Listening on udev Control Socket.
Jan 22 04:00:26 np0005591762 systemd: Listening on udev Kernel Socket.
Jan 22 04:00:26 np0005591762 systemd: Mounting Huge Pages File System...
Jan 22 04:00:26 np0005591762 systemd: Mounting POSIX Message Queue File System...
Jan 22 04:00:26 np0005591762 systemd: Mounting Kernel Debug File System...
Jan 22 04:00:26 np0005591762 systemd: Mounting Kernel Trace File System...
Jan 22 04:00:26 np0005591762 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 04:00:26 np0005591762 systemd: Starting Create List of Static Device Nodes...
Jan 22 04:00:26 np0005591762 systemd: Starting Load Kernel Module configfs...
Jan 22 04:00:26 np0005591762 systemd: Starting Load Kernel Module drm...
Jan 22 04:00:26 np0005591762 systemd: Starting Load Kernel Module efi_pstore...
Jan 22 04:00:26 np0005591762 systemd: Starting Load Kernel Module fuse...
Jan 22 04:00:26 np0005591762 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 22 04:00:26 np0005591762 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd: Stopped File System Check on Root Device.
Jan 22 04:00:26 np0005591762 systemd: Stopped Journal Service.
Jan 22 04:00:26 np0005591762 systemd: Starting Journal Service...
Jan 22 04:00:26 np0005591762 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 04:00:26 np0005591762 kernel: fuse: init (API version 7.37)
Jan 22 04:00:26 np0005591762 systemd: Starting Generate network units from Kernel command line...
Jan 22 04:00:26 np0005591762 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 04:00:26 np0005591762 systemd: Starting Remount Root and Kernel File Systems...
Jan 22 04:00:26 np0005591762 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 22 04:00:26 np0005591762 systemd: Starting Apply Kernel Variables...
Jan 22 04:00:26 np0005591762 systemd: Starting Coldplug All udev Devices...
Jan 22 04:00:26 np0005591762 systemd: Mounted Huge Pages File System.
Jan 22 04:00:26 np0005591762 systemd-journald[650]: Journal started
Jan 22 04:00:26 np0005591762 systemd-journald[650]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 04:00:26 np0005591762 systemd[1]: Queued start job for default target Multi-User System.
Jan 22 04:00:26 np0005591762 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 kernel: ACPI: bus type drm_connector registered
Jan 22 04:00:26 np0005591762 systemd: Started Journal Service.
Jan 22 04:00:26 np0005591762 systemd[1]: Mounted POSIX Message Queue File System.
Jan 22 04:00:26 np0005591762 systemd[1]: Mounted Kernel Debug File System.
Jan 22 04:00:26 np0005591762 systemd[1]: Mounted Kernel Trace File System.
Jan 22 04:00:26 np0005591762 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Create List of Static Device Nodes.
Jan 22 04:00:26 np0005591762 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Load Kernel Module configfs.
Jan 22 04:00:26 np0005591762 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Load Kernel Module drm.
Jan 22 04:00:26 np0005591762 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 22 04:00:26 np0005591762 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Load Kernel Module fuse.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Generate network units from Kernel command line.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Apply Kernel Variables.
Jan 22 04:00:26 np0005591762 systemd[1]: Mounting FUSE Control File System...
Jan 22 04:00:26 np0005591762 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Rebuild Hardware Database...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 22 04:00:26 np0005591762 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Load/Save OS Random Seed...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Create System Users...
Jan 22 04:00:26 np0005591762 systemd-journald[650]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 04:00:26 np0005591762 systemd-journald[650]: Received client request to flush runtime journal.
Jan 22 04:00:26 np0005591762 systemd[1]: Mounted FUSE Control File System.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Load/Save OS Random Seed.
Jan 22 04:00:26 np0005591762 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Create System Users.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Coldplug All udev Devices.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target Preparation for Local File Systems.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target Local File Systems.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 22 04:00:26 np0005591762 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 22 04:00:26 np0005591762 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 22 04:00:26 np0005591762 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Automatic Boot Loader Update...
Jan 22 04:00:26 np0005591762 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 04:00:26 np0005591762 bootctl[668]: Couldn't find EFI system partition, skipping.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Automatic Boot Loader Update.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Security Auditing Service...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting RPC Bind...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Rebuild Journal Catalog...
Jan 22 04:00:26 np0005591762 auditd[675]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 22 04:00:26 np0005591762 auditd[675]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Rebuild Journal Catalog.
Jan 22 04:00:26 np0005591762 systemd[1]: Started RPC Bind.
Jan 22 04:00:26 np0005591762 augenrules[680]: /sbin/augenrules: No change
Jan 22 04:00:26 np0005591762 augenrules[695]: No rules
Jan 22 04:00:26 np0005591762 augenrules[695]: enabled 1
Jan 22 04:00:26 np0005591762 augenrules[695]: failure 1
Jan 22 04:00:26 np0005591762 augenrules[695]: pid 675
Jan 22 04:00:26 np0005591762 augenrules[695]: rate_limit 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_limit 8192
Jan 22 04:00:26 np0005591762 augenrules[695]: lost 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_wait_time 60000
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_wait_time_actual 0
Jan 22 04:00:26 np0005591762 augenrules[695]: enabled 1
Jan 22 04:00:26 np0005591762 augenrules[695]: failure 1
Jan 22 04:00:26 np0005591762 augenrules[695]: pid 675
Jan 22 04:00:26 np0005591762 augenrules[695]: rate_limit 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_limit 8192
Jan 22 04:00:26 np0005591762 augenrules[695]: lost 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog 3
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_wait_time 60000
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_wait_time_actual 0
Jan 22 04:00:26 np0005591762 augenrules[695]: enabled 1
Jan 22 04:00:26 np0005591762 augenrules[695]: failure 1
Jan 22 04:00:26 np0005591762 augenrules[695]: pid 675
Jan 22 04:00:26 np0005591762 augenrules[695]: rate_limit 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_limit 8192
Jan 22 04:00:26 np0005591762 augenrules[695]: lost 0
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog 3
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_wait_time 60000
Jan 22 04:00:26 np0005591762 augenrules[695]: backlog_wait_time_actual 0
Jan 22 04:00:26 np0005591762 systemd[1]: Started Security Auditing Service.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Rebuild Hardware Database.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Update is Completed...
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Update is Completed.
Jan 22 04:00:26 np0005591762 systemd-udevd[703]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 04:00:26 np0005591762 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target System Initialization.
Jan 22 04:00:26 np0005591762 systemd[1]: Started dnf makecache --timer.
Jan 22 04:00:26 np0005591762 systemd[1]: Started Daily rotation of log files.
Jan 22 04:00:26 np0005591762 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target Timer Units.
Jan 22 04:00:26 np0005591762 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 22 04:00:26 np0005591762 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target Socket Units.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting D-Bus System Message Bus...
Jan 22 04:00:26 np0005591762 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Load Kernel Module configfs...
Jan 22 04:00:26 np0005591762 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Load Kernel Module configfs.
Jan 22 04:00:26 np0005591762 systemd[1]: Started D-Bus System Message Bus.
Jan 22 04:00:26 np0005591762 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target Basic System.
Jan 22 04:00:26 np0005591762 dbus-broker-lau[712]: Ready
Jan 22 04:00:26 np0005591762 systemd[1]: Starting NTP client/server...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 22 04:00:26 np0005591762 systemd[1]: Starting IPv4 firewall with iptables...
Jan 22 04:00:26 np0005591762 systemd[1]: Started irqbalance daemon.
Jan 22 04:00:26 np0005591762 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 22 04:00:26 np0005591762 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 04:00:26 np0005591762 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 04:00:26 np0005591762 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target sshd-keygen.target.
Jan 22 04:00:26 np0005591762 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 22 04:00:26 np0005591762 systemd[1]: Reached target User and Group Name Lookups.
Jan 22 04:00:26 np0005591762 systemd[1]: Starting User Login Management...
Jan 22 04:00:26 np0005591762 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 22 04:00:27 np0005591762 systemd-udevd[731]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:00:27 np0005591762 chronyd[753]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 04:00:27 np0005591762 chronyd[753]: Loaded 0 symmetric keys
Jan 22 04:00:27 np0005591762 chronyd[753]: Using right/UTC timezone to obtain leap second data
Jan 22 04:00:27 np0005591762 chronyd[753]: Loaded seccomp filter (level 2)
Jan 22 04:00:27 np0005591762 systemd-logind[744]: New seat seat0.
Jan 22 04:00:27 np0005591762 systemd[1]: Started User Login Management.
Jan 22 04:00:27 np0005591762 systemd[1]: Started NTP client/server.
Jan 22 04:00:27 np0005591762 systemd-logind[744]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 04:00:27 np0005591762 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 22 04:00:27 np0005591762 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 22 04:00:27 np0005591762 systemd-logind[744]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 04:00:27 np0005591762 kernel: lpc_ich 0000:00:1f.0: I/O space for GPIO uninitialized
Jan 22 04:00:27 np0005591762 iptables.init[738]: iptables: Applying firewall rules: [  OK  ]
Jan 22 04:00:27 np0005591762 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Jan 22 04:00:27 np0005591762 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 22 04:00:27 np0005591762 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 22 04:00:27 np0005591762 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 22 04:00:27 np0005591762 systemd[1]: Finished IPv4 firewall with iptables.
Jan 22 04:00:27 np0005591762 kernel: iTCO_vendor_support: vendor-support=0
Jan 22 04:00:27 np0005591762 kernel: iTCO_wdt iTCO_wdt.1.auto: Found a ICH9 TCO device (Version=2, TCOBASE=0x0660)
Jan 22 04:00:27 np0005591762 kernel: iTCO_wdt iTCO_wdt.1.auto: initialized. heartbeat=30 sec (nowayout=0)
Jan 22 04:00:27 np0005591762 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Jan 22 04:00:27 np0005591762 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Jan 22 04:00:27 np0005591762 kernel: Console: switching to colour dummy device 80x25
Jan 22 04:00:27 np0005591762 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 22 04:00:27 np0005591762 kernel: [drm] features: -context_init
Jan 22 04:00:27 np0005591762 kernel: [drm] number of scanouts: 1
Jan 22 04:00:27 np0005591762 kernel: [drm] number of cap sets: 0
Jan 22 04:00:27 np0005591762 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Jan 22 04:00:27 np0005591762 kernel: kvm_amd: TSC scaling supported
Jan 22 04:00:27 np0005591762 kernel: kvm_amd: Nested Virtualization enabled
Jan 22 04:00:27 np0005591762 kernel: kvm_amd: Nested Paging enabled
Jan 22 04:00:27 np0005591762 kernel: kvm_amd: LBR virtualization supported
Jan 22 04:00:27 np0005591762 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Jan 22 04:00:27 np0005591762 kernel: kvm_amd: Virtual GIF supported
Jan 22 04:00:27 np0005591762 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 22 04:00:27 np0005591762 kernel: Console: switching to colour frame buffer device 160x50
Jan 22 04:00:27 np0005591762 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 22 04:00:27 np0005591762 cloud-init[795]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 22 Jan 2026 09:00:27 +0000. Up 5.08 seconds.
Jan 22 04:00:27 np0005591762 systemd[1]: run-cloud\x2dinit-tmp-tmpf6buq9fj.mount: Deactivated successfully.
Jan 22 04:00:27 np0005591762 systemd[1]: Starting Hostname Service...
Jan 22 04:00:27 np0005591762 systemd[1]: Started Hostname Service.
Jan 22 04:00:27 np0005591762 systemd-hostnamed[809]: Hostname set to <np0005591762> (static)
Jan 22 04:00:27 np0005591762 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 22 04:00:27 np0005591762 systemd[1]: Reached target Preparation for Network.
Jan 22 04:00:27 np0005591762 systemd[1]: Starting Network Manager...
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9105] NetworkManager (version 1.54.3-2.el9) is starting... (boot:8b9e49f6-dfde-4886-8f0c-7f0567b85e9e)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9108] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9187] manager[0x563a2c2ad000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9217] hostname: hostname: using hostnamed
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9217] hostname: static hostname changed from (none) to "np0005591762"
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9220] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9309] manager[0x563a2c2ad000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9310] manager[0x563a2c2ad000]: rfkill: WWAN hardware radio set enabled
Jan 22 04:00:27 np0005591762 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9360] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9360] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9360] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9361] manager: Networking is enabled by state file
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9364] settings: Loaded settings plugin: keyfile (internal)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9381] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9400] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9408] dhcp: init: Using DHCP client 'internal'
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9410] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9419] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9426] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9432] device (lo): Activation: starting connection 'lo' (8f4da813-534e-4822-a9a0-9dc45c872492)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9440] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9442] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:00:27 np0005591762 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 04:00:27 np0005591762 systemd[1]: Started Network Manager.
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9467] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9471] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9472] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9474] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9475] device (eth0): carrier: link connected
Jan 22 04:00:27 np0005591762 systemd[1]: Reached target Network.
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9477] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9481] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9488] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9490] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9491] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9493] manager: NetworkManager state is now CONNECTING
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9494] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9499] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:00:27 np0005591762 systemd[1]: Starting Network Manager Wait Online...
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9504] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9507] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 22 04:00:27 np0005591762 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9539] dhcp4 (eth0): state changed new lease, address=192.168.26.49
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9550] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 04:00:27 np0005591762 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9623] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9626] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 04:00:27 np0005591762 NetworkManager[813]: <info>  [1769072427.9635] device (lo): Activation: successful, device activated.
Jan 22 04:00:27 np0005591762 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 22 04:00:27 np0005591762 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 04:00:27 np0005591762 systemd[1]: Reached target NFS client services.
Jan 22 04:00:27 np0005591762 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 04:00:27 np0005591762 systemd[1]: Reached target Remote File Systems.
Jan 22 04:00:27 np0005591762 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 04:00:29 np0005591762 NetworkManager[813]: <info>  [1769072429.6444] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:00:30 np0005591762 NetworkManager[813]: <info>  [1769072430.6561] dhcp6 (eth0): state changed new lease, address=2001:db8::1cd
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0127] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0158] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0159] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0161] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0163] device (eth0): Activation: successful, device activated.
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0168] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 04:00:32 np0005591762 NetworkManager[813]: <info>  [1769072432.0169] manager: startup complete
Jan 22 04:00:32 np0005591762 systemd[1]: Finished Network Manager Wait Online.
Jan 22 04:00:32 np0005591762 systemd[1]: Starting Cloud-init: Network Stage...
Jan 22 04:00:32 np0005591762 cloud-init[879]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 22 Jan 2026 09:00:32 +0000. Up 9.85 seconds.
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |  eth0  | True |        192.168.26.49         | 255.255.255.0 | global | fa:16:3e:7c:73:fe |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |  eth0  | True |      2001:db8::1cd/128       |       .       | global | fa:16:3e:7c:73:fe |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |  eth0  | True | fe80::f816:3eff:fe7c:73fe/64 |       .       |  link  | fa:16:3e:7c:73:fe |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: | Route |   Destination   |   Gateway    |     Genmask     | Interface | Flags |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   0   |     0.0.0.0     | 192.168.26.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   1   | 169.254.169.254 | 192.168.26.2 | 255.255.255.255 |    eth0   |  UGH  |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   2   |   192.168.26.0  |   0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: ++++++++++++++++++++++Route IPv6 info++++++++++++++++++++++
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: | Route |  Destination  |   Gateway   | Interface | Flags |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   1   |  2001:db8::1  |      ::     |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   2   | 2001:db8::1cd |      ::     |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   3   |   fe80::/64   |      ::     |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   4   |      ::/0     | 2001:db8::1 |    eth0   |   UG  |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   6   |     local     |      ::     |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   7   |     local     |      ::     |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: |   8   |   multicast   |      ::     |    eth0   |   U   |
Jan 22 04:00:32 np0005591762 cloud-init[879]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 22 04:00:32 np0005591762 chronyd[753]: Selected source 108.61.215.221 (2.centos.pool.ntp.org)
Jan 22 04:00:32 np0005591762 chronyd[753]: System clock TAI offset set to 37 seconds
Jan 22 04:00:33 np0005591762 cloud-init[879]: Generating public/private rsa key pair.
Jan 22 04:00:33 np0005591762 cloud-init[879]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 22 04:00:33 np0005591762 cloud-init[879]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 22 04:00:33 np0005591762 cloud-init[879]: The key fingerprint is:
Jan 22 04:00:33 np0005591762 cloud-init[879]: SHA256:5gtBI3Hvj3RbewVOfjriLFk8JGoj74sB1xcCHMHryYQ root@np0005591762
Jan 22 04:00:33 np0005591762 cloud-init[879]: The key's randomart image is:
Jan 22 04:00:33 np0005591762 cloud-init[879]: +---[RSA 3072]----+
Jan 22 04:00:33 np0005591762 cloud-init[879]: |    .o=o         |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |     ooo         |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |    ..o.o .   o  |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |    Eoo+ ....+ . |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |    .+o.S.o+. o o|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |     o==+= o+. + |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |      o+o.ooo.+  |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |       +..oo o . |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |      ..+. .o    |
Jan 22 04:00:33 np0005591762 cloud-init[879]: +----[SHA256]-----+
Jan 22 04:00:33 np0005591762 cloud-init[879]: Generating public/private ecdsa key pair.
Jan 22 04:00:33 np0005591762 cloud-init[879]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 22 04:00:33 np0005591762 cloud-init[879]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 22 04:00:33 np0005591762 cloud-init[879]: The key fingerprint is:
Jan 22 04:00:33 np0005591762 cloud-init[879]: SHA256:VWmmvbWmOPquIVXjdhzJ/ZQ3rqg6AJJUksM+WgFQU7A root@np0005591762
Jan 22 04:00:33 np0005591762 cloud-init[879]: The key's randomart image is:
Jan 22 04:00:33 np0005591762 cloud-init[879]: +---[ECDSA 256]---+
Jan 22 04:00:33 np0005591762 cloud-init[879]: |=+*+.       ..   |
Jan 22 04:00:33 np0005591762 cloud-init[879]: | =oo       o+o  .|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |o E.      +=+ .oo|
Jan 22 04:00:33 np0005591762 cloud-init[879]: | =..     +.o..ooo|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |..o .   S o oo o.|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |.    . . . .o +  |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |      o .  o +   |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |       o .+ .    |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |       .=*o.     |
Jan 22 04:00:33 np0005591762 cloud-init[879]: +----[SHA256]-----+
Jan 22 04:00:33 np0005591762 cloud-init[879]: Generating public/private ed25519 key pair.
Jan 22 04:00:33 np0005591762 cloud-init[879]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 22 04:00:33 np0005591762 cloud-init[879]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 22 04:00:33 np0005591762 cloud-init[879]: The key fingerprint is:
Jan 22 04:00:33 np0005591762 cloud-init[879]: SHA256:0J7Em0VzOdgMTRD3mGqXXNVgRKfxPBqs/UXJHGwUXbM root@np0005591762
Jan 22 04:00:33 np0005591762 cloud-init[879]: The key's randomart image is:
Jan 22 04:00:33 np0005591762 cloud-init[879]: +--[ED25519 256]--+
Jan 22 04:00:33 np0005591762 cloud-init[879]: |          *X++O*O|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |       o ..+B=oXB|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |      . + . o+=E+|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |       + = oooo..|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |        S o.+o  .|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |         . .  . .|
Jan 22 04:00:33 np0005591762 cloud-init[879]: |               . |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |                 |
Jan 22 04:00:33 np0005591762 cloud-init[879]: |                 |
Jan 22 04:00:33 np0005591762 cloud-init[879]: +----[SHA256]-----+
Jan 22 04:00:33 np0005591762 systemd[1]: Finished Cloud-init: Network Stage.
Jan 22 04:00:33 np0005591762 systemd[1]: Reached target Cloud-config availability.
Jan 22 04:00:33 np0005591762 systemd[1]: Reached target Network is Online.
Jan 22 04:00:33 np0005591762 systemd[1]: Starting Cloud-init: Config Stage...
Jan 22 04:00:33 np0005591762 systemd[1]: Starting Crash recovery kernel arming...
Jan 22 04:00:33 np0005591762 systemd[1]: Starting Notify NFS peers of a restart...
Jan 22 04:00:33 np0005591762 systemd[1]: Starting System Logging Service...
Jan 22 04:00:33 np0005591762 sm-notify[962]: Version 2.5.4 starting
Jan 22 04:00:33 np0005591762 systemd[1]: Starting OpenSSH server daemon...
Jan 22 04:00:33 np0005591762 systemd[1]: Starting Permit User Sessions...
Jan 22 04:00:33 np0005591762 systemd[1]: Started Notify NFS peers of a restart.
Jan 22 04:00:33 np0005591762 systemd[1]: Started OpenSSH server daemon.
Jan 22 04:00:33 np0005591762 systemd[1]: Finished Permit User Sessions.
Jan 22 04:00:33 np0005591762 systemd[1]: Started Command Scheduler.
Jan 22 04:00:33 np0005591762 systemd[1]: Started Getty on tty1.
Jan 22 04:00:33 np0005591762 systemd[1]: Started Serial Getty on ttyS0.
Jan 22 04:00:33 np0005591762 systemd[1]: Reached target Login Prompts.
Jan 22 04:00:33 np0005591762 rsyslogd[963]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="963" x-info="https://www.rsyslog.com"] start
Jan 22 04:00:33 np0005591762 rsyslogd[963]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 22 04:00:33 np0005591762 systemd[1]: Started System Logging Service.
Jan 22 04:00:33 np0005591762 systemd[1]: Reached target Multi-User System.
Jan 22 04:00:33 np0005591762 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 22 04:00:33 np0005591762 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 22 04:00:33 np0005591762 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 22 04:00:33 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:00:33 np0005591762 kdumpctl[975]: kdump: No kdump initial ramdisk found.
Jan 22 04:00:33 np0005591762 kdumpctl[975]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 22 04:00:33 np0005591762 cloud-init[1174]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 22 Jan 2026 09:00:33 +0000. Up 11.18 seconds.
Jan 22 04:00:33 np0005591762 systemd[1]: Finished Cloud-init: Config Stage.
Jan 22 04:00:33 np0005591762 systemd[1]: Starting Cloud-init: Final Stage...
Jan 22 04:00:33 np0005591762 dracut[1241]: dracut-057-102.git20250818.el9
Jan 22 04:00:33 np0005591762 dracut[1243]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 22 04:00:33 np0005591762 cloud-init[1309]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 22 Jan 2026 09:00:33 +0000. Up 11.56 seconds.
Jan 22 04:00:34 np0005591762 cloud-init[1316]: #############################################################
Jan 22 04:00:34 np0005591762 cloud-init[1317]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 22 04:00:34 np0005591762 cloud-init[1322]: 256 SHA256:VWmmvbWmOPquIVXjdhzJ/ZQ3rqg6AJJUksM+WgFQU7A root@np0005591762 (ECDSA)
Jan 22 04:00:34 np0005591762 cloud-init[1324]: 256 SHA256:0J7Em0VzOdgMTRD3mGqXXNVgRKfxPBqs/UXJHGwUXbM root@np0005591762 (ED25519)
Jan 22 04:00:34 np0005591762 cloud-init[1329]: 3072 SHA256:5gtBI3Hvj3RbewVOfjriLFk8JGoj74sB1xcCHMHryYQ root@np0005591762 (RSA)
Jan 22 04:00:34 np0005591762 cloud-init[1330]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 22 04:00:34 np0005591762 cloud-init[1331]: #############################################################
Jan 22 04:00:34 np0005591762 cloud-init[1309]: Cloud-init v. 24.4-8.el9 finished at Thu, 22 Jan 2026 09:00:34 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.70 seconds
Jan 22 04:00:34 np0005591762 systemd[1]: Finished Cloud-init: Final Stage.
Jan 22 04:00:34 np0005591762 systemd[1]: Reached target Cloud-init target.
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: memstrack is not available
Jan 22 04:00:34 np0005591762 dracut[1243]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 04:00:34 np0005591762 dracut[1243]: memstrack is not available
Jan 22 04:00:34 np0005591762 dracut[1243]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 04:00:35 np0005591762 dracut[1243]: *** Including module: systemd ***
Jan 22 04:00:35 np0005591762 dracut[1243]: *** Including module: fips ***
Jan 22 04:00:35 np0005591762 dracut[1243]: *** Including module: systemd-initrd ***
Jan 22 04:00:35 np0005591762 dracut[1243]: *** Including module: i18n ***
Jan 22 04:00:35 np0005591762 dracut[1243]: *** Including module: drm ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: prefixdevname ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: kernel-modules ***
Jan 22 04:00:36 np0005591762 kernel: block vda: the capability attribute has been deprecated.
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: kernel-modules-extra ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: qemu ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: fstab-sys ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: rootfs-block ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: terminfo ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: udev-rules ***
Jan 22 04:00:36 np0005591762 dracut[1243]: Skipping udev rule: 91-permissions.rules
Jan 22 04:00:36 np0005591762 dracut[1243]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: virtiofs ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: dracut-systemd ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: usrmount ***
Jan 22 04:00:36 np0005591762 dracut[1243]: *** Including module: base ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including module: fs-lib ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including module: kdumpbase ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 22 04:00:37 np0005591762 dracut[1243]:  microcode_ctl module: mangling fw_dir
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 22 04:00:37 np0005591762 irqbalance[740]: Cannot change IRQ 45 affinity: Operation not permitted
Jan 22 04:00:37 np0005591762 irqbalance[740]: IRQ 45 affinity is now unmanaged
Jan 22 04:00:37 np0005591762 irqbalance[740]: Cannot change IRQ 48 affinity: Operation not permitted
Jan 22 04:00:37 np0005591762 irqbalance[740]: IRQ 48 affinity is now unmanaged
Jan 22 04:00:37 np0005591762 irqbalance[740]: Cannot change IRQ 46 affinity: Operation not permitted
Jan 22 04:00:37 np0005591762 irqbalance[740]: IRQ 46 affinity is now unmanaged
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 22 04:00:37 np0005591762 dracut[1243]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including module: openssl ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including module: shutdown ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including module: squash ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Including modules done ***
Jan 22 04:00:37 np0005591762 dracut[1243]: *** Installing kernel module dependencies ***
Jan 22 04:00:38 np0005591762 dracut[1243]: *** Installing kernel module dependencies done ***
Jan 22 04:00:38 np0005591762 dracut[1243]: *** Resolving executable dependencies ***
Jan 22 04:00:39 np0005591762 dracut[1243]: *** Resolving executable dependencies done ***
Jan 22 04:00:39 np0005591762 dracut[1243]: *** Generating early-microcode cpio image ***
Jan 22 04:00:39 np0005591762 dracut[1243]: *** Store current command line parameters ***
Jan 22 04:00:39 np0005591762 dracut[1243]: Stored kernel commandline:
Jan 22 04:00:39 np0005591762 dracut[1243]: No dracut internal kernel commandline stored in the initramfs
Jan 22 04:00:39 np0005591762 dracut[1243]: *** Install squash loader ***
Jan 22 04:00:40 np0005591762 dracut[1243]: *** Squashing the files inside the initramfs ***
Jan 22 04:00:41 np0005591762 dracut[1243]: *** Squashing the files inside the initramfs done ***
Jan 22 04:00:41 np0005591762 dracut[1243]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 22 04:00:41 np0005591762 dracut[1243]: *** Hardlinking files ***
Jan 22 04:00:41 np0005591762 dracut[1243]: *** Hardlinking files done ***
Jan 22 04:00:42 np0005591762 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 04:00:42 np0005591762 dracut[1243]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 22 04:00:42 np0005591762 kdumpctl[975]: kdump: kexec: loaded kdump kernel
Jan 22 04:00:42 np0005591762 kdumpctl[975]: kdump: Starting kdump: [OK]
Jan 22 04:00:42 np0005591762 systemd[1]: Finished Crash recovery kernel arming.
Jan 22 04:00:42 np0005591762 systemd[1]: Startup finished in 1.361s (kernel) + 2.079s (initrd) + 16.628s (userspace) = 20.069s.
Jan 22 04:00:57 np0005591762 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 04:01:58 np0005591762 systemd[1]: Created slice User Slice of UID 1000.
Jan 22 04:01:58 np0005591762 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 22 04:01:58 np0005591762 systemd-logind[744]: New session 1 of user zuul.
Jan 22 04:01:58 np0005591762 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 22 04:01:58 np0005591762 systemd[1]: Starting User Manager for UID 1000...
Jan 22 04:01:59 np0005591762 systemd[4396]: Queued start job for default target Main User Target.
Jan 22 04:01:59 np0005591762 systemd[4396]: Created slice User Application Slice.
Jan 22 04:01:59 np0005591762 systemd[4396]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 04:01:59 np0005591762 systemd[4396]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 04:01:59 np0005591762 systemd[4396]: Reached target Paths.
Jan 22 04:01:59 np0005591762 systemd[4396]: Reached target Timers.
Jan 22 04:01:59 np0005591762 systemd[4396]: Starting D-Bus User Message Bus Socket...
Jan 22 04:01:59 np0005591762 systemd[4396]: Starting Create User's Volatile Files and Directories...
Jan 22 04:01:59 np0005591762 systemd[4396]: Listening on D-Bus User Message Bus Socket.
Jan 22 04:01:59 np0005591762 systemd[4396]: Finished Create User's Volatile Files and Directories.
Jan 22 04:01:59 np0005591762 systemd[4396]: Reached target Sockets.
Jan 22 04:01:59 np0005591762 systemd[4396]: Reached target Basic System.
Jan 22 04:01:59 np0005591762 systemd[4396]: Reached target Main User Target.
Jan 22 04:01:59 np0005591762 systemd[4396]: Startup finished in 78ms.
Jan 22 04:01:59 np0005591762 systemd[1]: Started User Manager for UID 1000.
Jan 22 04:01:59 np0005591762 systemd[1]: Started Session 1 of User zuul.
Jan 22 04:01:59 np0005591762 python3[4478]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:02:02 np0005591762 python3[4506]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:02:07 np0005591762 python3[4560]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:02:07 np0005591762 python3[4600]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 22 04:02:09 np0005591762 python3[4626]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDlMRuJHMYjRIwgFF7czfU/mQMp6kd/gekboAQOEyPZmFQ60ialgxg9ko3arflxh6BUDY9IRw5tg9Bc05rdPoHNVoypQr/DoxSfvsU84qPJm+lDycIqATeh/aT4guxaryYYTWBZ4qDDNJ35iJKBI+7e8DCUN5iq/pdAbiNSXUkQ/mE/YyPwnoX/VfLmek3usQJ/7ks+f6SDf9imXAGwT8SyPYwF+zBEuiCwyHajZ7DAyPYxASuh7iKE6DtE4RAjr3e6tw4K+9sA35hpbH+WT9EhJpLUdfBpl/QToPLojuyCl4dAuCl95OwtPOeUYqdk+JFHXpD/37JeXcYPjNEoLM8nt6W20iBSKdTVjXU5ZDirWEMkSGLei0FtsZXsdLvA/YQSMBlGd9t1Ex6YkkmpgrcuppALH+M1an0gLxQnL4d1uQWn8dD3uwJfOw5KbMPjT2zVrTvRc2SpKcEsAiyiqYXq45wiJmyMXbJHUeTJ8OIbMjvRn3iwGQr3A/Hpddfw/E= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:09 np0005591762 python3[4650]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:09 np0005591762 python3[4749]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:09 np0005591762 python3[4820]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769072529.5296748-254-128615302649766/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=cf287e8594d94086a6e561a1533072f0_id_rsa follow=False checksum=5fe97cb15fc153784845243a4fc540b8e8f96206 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:10 np0005591762 python3[4943]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:10 np0005591762 python3[5014]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769072530.1878366-309-188106043135444/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=cf287e8594d94086a6e561a1533072f0_id_rsa.pub follow=False checksum=31c396396a28f842c445fba0dac187f292042cc5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:11 np0005591762 python3[5062]: ansible-ping Invoked with data=pong
Jan 22 04:02:12 np0005591762 python3[5086]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:02:13 np0005591762 python3[5140]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 22 04:02:14 np0005591762 python3[5172]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:14 np0005591762 python3[5196]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:15 np0005591762 python3[5220]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:15 np0005591762 python3[5244]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:15 np0005591762 python3[5268]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:15 np0005591762 python3[5292]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:17 np0005591762 python3[5318]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:17 np0005591762 python3[5396]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:17 np0005591762 python3[5469]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769072537.1626637-34-92913706358976/source follow=False _original_basename=mirror_info.sh.j2 checksum=3f92644b791816833989d215b9a84c589a7b8ebd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:18 np0005591762 python3[5517]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:18 np0005591762 python3[5541]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:18 np0005591762 python3[5565]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:18 np0005591762 python3[5589]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:19 np0005591762 python3[5613]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:19 np0005591762 python3[5637]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:19 np0005591762 python3[5661]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:19 np0005591762 python3[5685]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:19 np0005591762 python3[5709]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:19 np0005591762 python3[5733]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:20 np0005591762 python3[5757]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:20 np0005591762 python3[5781]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:20 np0005591762 python3[5805]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:20 np0005591762 python3[5829]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:20 np0005591762 python3[5853]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:21 np0005591762 python3[5877]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:21 np0005591762 python3[5901]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:21 np0005591762 python3[5925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:21 np0005591762 python3[5949]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:21 np0005591762 python3[5973]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:22 np0005591762 python3[5997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:22 np0005591762 python3[6021]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:22 np0005591762 python3[6045]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:22 np0005591762 python3[6069]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:22 np0005591762 python3[6093]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:23 np0005591762 python3[6117]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:02:25 np0005591762 python3[6143]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 04:02:25 np0005591762 systemd[1]: Starting Time & Date Service...
Jan 22 04:02:25 np0005591762 systemd[1]: Started Time & Date Service.
Jan 22 04:02:25 np0005591762 systemd-timedated[6145]: Changed time zone to 'UTC' (UTC).
Jan 22 04:02:27 np0005591762 python3[6174]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:27 np0005591762 python3[6250]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:27 np0005591762 python3[6321]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769072547.3021913-255-207000542105446/source _original_basename=tmp7sm9z_8_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:28 np0005591762 python3[6421]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:28 np0005591762 python3[6492]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769072547.9279027-305-278389424043461/source _original_basename=tmpho9par7l follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:29 np0005591762 python3[6594]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:29 np0005591762 python3[6667]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769072548.8244605-385-172344997011876/source _original_basename=tmpjeko_lxd follow=False checksum=0b70de0bdd82b352a09b4d5c05bc3aec934661b8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:29 np0005591762 python3[6715]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:02:29 np0005591762 python3[6741]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:02:30 np0005591762 python3[6821]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:02:30 np0005591762 python3[6894]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769072549.9876113-455-24549570648853/source _original_basename=tmpagbjpmk1 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:30 np0005591762 python3[6945]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e4f-9ce5-4910-b3ff-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:02:31 np0005591762 python3[6973]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e4f-9ce5-4910-b3ff-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 22 04:02:32 np0005591762 python3[7001]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:37 np0005591762 irqbalance[740]: Cannot change IRQ 47 affinity: Operation not permitted
Jan 22 04:02:37 np0005591762 irqbalance[740]: IRQ 47 affinity is now unmanaged
Jan 22 04:02:48 np0005591762 python3[7027]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:02:55 np0005591762 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 04:03:48 np0005591762 systemd-logind[744]: Session 1 logged out. Waiting for processes to exit.
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: ROM [mem 0x00000000-0x0003ffff pref]
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: ROM [mem 0xfe000000-0xfe03ffff pref]: assigned
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfb600000-0xfb603fff 64bit pref]: assigned
Jan 22 04:03:51 np0005591762 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfe040000-0xfe040fff]: assigned
Jan 22 04:03:51 np0005591762 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.1922] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 04:03:51 np0005591762 systemd-udevd[7031]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2041] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2060] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2064] device (eth1): carrier: link connected
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2066] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2071] policy: auto-activating connection 'Wired connection 1' (795c5902-e4d2-3a00-9d45-2dfdafd10a3d)
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2075] device (eth1): Activation: starting connection 'Wired connection 1' (795c5902-e4d2-3a00-9d45-2dfdafd10a3d)
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2075] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2079] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2083] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:03:51 np0005591762 NetworkManager[813]: <info>  [1769072631.2088] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:03:51 np0005591762 systemd-logind[744]: New session 3 of user zuul.
Jan 22 04:03:51 np0005591762 systemd[1]: Started Session 3 of User zuul.
Jan 22 04:03:51 np0005591762 python3[7061]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e4f-9ce5-e59a-b3b6-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:04:00 np0005591762 systemd[4396]: Starting Mark boot as successful...
Jan 22 04:04:00 np0005591762 systemd[4396]: Finished Mark boot as successful.
Jan 22 04:04:01 np0005591762 python3[7142]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:04:01 np0005591762 python3[7215]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769072641.2826722-212-36728720974954/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=5bec32df5bc1d09bd6b4e82ccce6ad22a876a1c3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:04:02 np0005591762 python3[7265]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:04:02 np0005591762 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 04:04:02 np0005591762 systemd[1]: Stopped Network Manager Wait Online.
Jan 22 04:04:02 np0005591762 systemd[1]: Stopping Network Manager Wait Online...
Jan 22 04:04:02 np0005591762 systemd[1]: Stopping Network Manager...
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0520] caught SIGTERM, shutting down normally.
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0531] dhcp4 (eth0): canceled DHCP transaction
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0531] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0531] dhcp4 (eth0): state changed no lease
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0533] dhcp6 (eth0): canceled DHCP transaction
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0533] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0533] dhcp6 (eth0): state changed no lease
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0536] manager: NetworkManager state is now CONNECTING
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0565] dhcp4 (eth1): canceled DHCP transaction
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0565] dhcp4 (eth1): state changed no lease
Jan 22 04:04:02 np0005591762 NetworkManager[813]: <info>  [1769072642.0586] exiting (success)
Jan 22 04:04:02 np0005591762 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 04:04:02 np0005591762 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 04:04:02 np0005591762 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 04:04:02 np0005591762 systemd[1]: Stopped Network Manager.
Jan 22 04:04:02 np0005591762 systemd[1]: Starting Network Manager...
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1032] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:8b9e49f6-dfde-4886-8f0c-7f0567b85e9e)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1032] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1069] manager[0x55970d4d2000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 04:04:02 np0005591762 systemd[1]: Starting Hostname Service...
Jan 22 04:04:02 np0005591762 systemd[1]: Started Hostname Service.
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1576] hostname: hostname: using hostnamed
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1577] hostname: static hostname changed from (none) to "np0005591762"
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1579] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1581] manager[0x55970d4d2000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1581] manager[0x55970d4d2000]: rfkill: WWAN hardware radio set enabled
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1598] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1599] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1599] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1599] manager: Networking is enabled by state file
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1600] settings: Loaded settings plugin: keyfile (internal)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1603] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1619] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1625] dhcp: init: Using DHCP client 'internal'
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1626] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1630] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1633] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1638] device (lo): Activation: starting connection 'lo' (8f4da813-534e-4822-a9a0-9dc45c872492)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1642] device (eth0): carrier: link connected
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1645] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1647] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1648] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1651] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1655] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1658] device (eth1): carrier: link connected
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1661] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1663] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (795c5902-e4d2-3a00-9d45-2dfdafd10a3d) (indicated)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1663] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1666] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1669] device (eth1): Activation: starting connection 'Wired connection 1' (795c5902-e4d2-3a00-9d45-2dfdafd10a3d)
Jan 22 04:04:02 np0005591762 systemd[1]: Started Network Manager.
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1686] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1688] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1691] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1691] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1692] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1694] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1695] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1697] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1698] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1701] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1709] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1712] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1715] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1723] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 22 04:04:02 np0005591762 systemd[1]: Starting Network Manager Wait Online...
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1726] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1729] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1732] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1735] device (lo): Activation: successful, device activated.
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1740] dhcp4 (eth0): state changed new lease, address=192.168.26.49
Jan 22 04:04:02 np0005591762 NetworkManager[7277]: <info>  [1769072642.1744] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 04:04:02 np0005591762 python3[7337]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e4f-9ce5-e59a-b3b6-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2693] dhcp6 (eth0): state changed new lease, address=2001:db8::1cd
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2700] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2723] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2725] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2727] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2728] device (eth0): Activation: successful, device activated.
Jan 22 04:04:03 np0005591762 NetworkManager[7277]: <info>  [1769072643.2732] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 04:04:13 np0005591762 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 04:04:32 np0005591762 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.3995] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 04:04:47 np0005591762 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 04:04:47 np0005591762 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4242] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4244] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4247] device (eth1): Activation: successful, device activated.
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4251] manager: startup complete
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4252] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <warn>  [1769072687.4254] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4258] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 systemd[1]: Finished Network Manager Wait Online.
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4314] dhcp4 (eth1): canceled DHCP transaction
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4315] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4315] dhcp4 (eth1): state changed no lease
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4323] policy: auto-activating connection 'ci-private-network' (7f74ce3a-a81b-54b7-b052-8987ea8817f8)
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4327] device (eth1): Activation: starting connection 'ci-private-network' (7f74ce3a-a81b-54b7-b052-8987ea8817f8)
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4327] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4329] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4333] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4339] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4362] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4363] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:04:47 np0005591762 NetworkManager[7277]: <info>  [1769072687.4366] device (eth1): Activation: successful, device activated.
Jan 22 04:04:51 np0005591762 python3[7461]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:04:51 np0005591762 python3[7534]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769072691.0490465-379-243800743804978/source _original_basename=tmp3el6ycdm follow=False checksum=4107463a008ffd1ee3e83966dce6317b3b41e8b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:04:52 np0005591762 systemd[1]: session-3.scope: Deactivated successfully.
Jan 22 04:04:52 np0005591762 systemd[1]: session-3.scope: Consumed 1.488s CPU time.
Jan 22 04:04:52 np0005591762 systemd-logind[744]: Session 3 logged out. Waiting for processes to exit.
Jan 22 04:04:52 np0005591762 systemd-logind[744]: Removed session 3.
Jan 22 04:04:57 np0005591762 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 04:07:00 np0005591762 systemd[4396]: Created slice User Background Tasks Slice.
Jan 22 04:07:00 np0005591762 systemd[4396]: Starting Cleanup of User's Temporary Files and Directories...
Jan 22 04:07:00 np0005591762 systemd[4396]: Finished Cleanup of User's Temporary Files and Directories.
Jan 22 04:09:45 np0005591762 systemd-logind[744]: New session 4 of user zuul.
Jan 22 04:09:45 np0005591762 systemd[1]: Started Session 4 of User zuul.
Jan 22 04:09:45 np0005591762 python3[7591]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e4f-9ce5-b8c1-a785-00000000216f-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:09:46 np0005591762 python3[7620]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:09:46 np0005591762 python3[7646]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:09:46 np0005591762 python3[7672]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:09:46 np0005591762 python3[7698]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:09:47 np0005591762 python3[7724]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:09:47 np0005591762 python3[7802]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:09:48 np0005591762 python3[7875]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769072987.634621-542-9535538305397/source _original_basename=tmpsne6jvdr follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:09:49 np0005591762 python3[7925]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:09:49 np0005591762 systemd[1]: Reloading.
Jan 22 04:09:49 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:09:50 np0005591762 python3[7981]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 22 04:09:50 np0005591762 python3[8007]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:09:50 np0005591762 python3[8035]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:09:51 np0005591762 python3[8063]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:09:51 np0005591762 python3[8091]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:09:52 np0005591762 python3[8118]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e4f-9ce5-b8c1-a785-000000002176-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:09:52 np0005591762 python3[8148]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 22 04:09:55 np0005591762 systemd[1]: session-4.scope: Deactivated successfully.
Jan 22 04:09:55 np0005591762 systemd[1]: session-4.scope: Consumed 2.862s CPU time.
Jan 22 04:09:55 np0005591762 systemd-logind[744]: Session 4 logged out. Waiting for processes to exit.
Jan 22 04:09:55 np0005591762 systemd-logind[744]: Removed session 4.
Jan 22 04:09:56 np0005591762 systemd-logind[744]: New session 5 of user zuul.
Jan 22 04:09:56 np0005591762 systemd[1]: Started Session 5 of User zuul.
Jan 22 04:09:56 np0005591762 python3[8182]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 22 04:10:05 np0005591762 setsebool[8225]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 22 04:10:05 np0005591762 setsebool[8225]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 22 04:10:13 np0005591762 kernel: SELinux:  Converting 387 SID table entries...
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:10:13 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:10:18 np0005591762 chronyd[753]: Selected source 159.203.82.102 (2.centos.pool.ntp.org)
Jan 22 04:10:20 np0005591762 kernel: SELinux:  Converting 390 SID table entries...
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:10:20 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:10:33 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 04:10:33 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:10:33 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:10:33 np0005591762 systemd[1]: Reloading.
Jan 22 04:10:33 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:10:33 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:10:38 np0005591762 python3[14567]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e4f-9ce5-c8dd-b18c-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:10:38 np0005591762 kernel: evm: overlay not supported
Jan 22 04:10:38 np0005591762 systemd[4396]: Starting D-Bus User Message Bus...
Jan 22 04:10:38 np0005591762 dbus-broker-launch[15314]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 22 04:10:38 np0005591762 dbus-broker-launch[15314]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 22 04:10:38 np0005591762 systemd[4396]: Started D-Bus User Message Bus.
Jan 22 04:10:38 np0005591762 dbus-broker-lau[15314]: Ready
Jan 22 04:10:38 np0005591762 systemd[4396]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 04:10:38 np0005591762 systemd[4396]: Created slice Slice /user.
Jan 22 04:10:38 np0005591762 systemd[4396]: podman-15247.scope: unit configures an IP firewall, but not running as root.
Jan 22 04:10:38 np0005591762 systemd[4396]: (This warning is only shown for the first unit using IP firewalling.)
Jan 22 04:10:38 np0005591762 systemd[4396]: Started podman-15247.scope.
Jan 22 04:10:39 np0005591762 systemd[4396]: Started podman-pause-22c35cdb.scope.
Jan 22 04:10:39 np0005591762 systemd[1]: session-5.scope: Deactivated successfully.
Jan 22 04:10:39 np0005591762 systemd[1]: session-5.scope: Consumed 30.572s CPU time.
Jan 22 04:10:40 np0005591762 systemd-logind[744]: Session 5 logged out. Waiting for processes to exit.
Jan 22 04:10:40 np0005591762 systemd-logind[744]: Removed session 5.
Jan 22 04:10:57 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:10:57 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:10:57 np0005591762 systemd[1]: man-db-cache-update.service: Consumed 30.460s CPU time.
Jan 22 04:10:57 np0005591762 systemd[1]: run-r53571e59970a4b56b7916dc94c5a6bf5.service: Deactivated successfully.
Jan 22 04:11:05 np0005591762 systemd-logind[744]: New session 6 of user zuul.
Jan 22 04:11:05 np0005591762 systemd[1]: Started Session 6 of User zuul.
Jan 22 04:11:05 np0005591762 python3[29665]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQ7acfP3+x1cL69/ksfFQyqWrZMrgUMZrDd0CF5wqelljlFTSN4wbIyelUw7pI21/LAuvact75cdTAckFyYsBw= zuul@np0005591759#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:11:06 np0005591762 python3[29691]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQ7acfP3+x1cL69/ksfFQyqWrZMrgUMZrDd0CF5wqelljlFTSN4wbIyelUw7pI21/LAuvact75cdTAckFyYsBw= zuul@np0005591759#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:11:06 np0005591762 python3[29717]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005591762 update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 22 04:11:07 np0005591762 python3[29751]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQ7acfP3+x1cL69/ksfFQyqWrZMrgUMZrDd0CF5wqelljlFTSN4wbIyelUw7pI21/LAuvact75cdTAckFyYsBw= zuul@np0005591759#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 04:11:07 np0005591762 python3[29829]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:11:07 np0005591762 python3[29902]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769073067.3065283-155-158198456486339/source _original_basename=tmpl5thop1w follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:11:08 np0005591762 python3[29952]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 22 04:11:08 np0005591762 systemd[1]: Starting Hostname Service...
Jan 22 04:11:08 np0005591762 systemd[1]: Started Hostname Service.
Jan 22 04:11:08 np0005591762 systemd-hostnamed[29956]: Changed pretty hostname to 'compute-2'
Jan 22 04:11:08 np0005591762 systemd-hostnamed[29956]: Hostname set to <compute-2> (static)
Jan 22 04:11:08 np0005591762 NetworkManager[7277]: <info>  [1769073068.5702] hostname: static hostname changed from "np0005591762" to "compute-2"
Jan 22 04:11:08 np0005591762 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 04:11:08 np0005591762 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 04:11:08 np0005591762 systemd-logind[744]: Session 6 logged out. Waiting for processes to exit.
Jan 22 04:11:08 np0005591762 systemd[1]: session-6.scope: Deactivated successfully.
Jan 22 04:11:08 np0005591762 systemd[1]: session-6.scope: Consumed 1.626s CPU time.
Jan 22 04:11:08 np0005591762 systemd-logind[744]: Removed session 6.
Jan 22 04:11:18 np0005591762 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 04:11:38 np0005591762 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 04:14:21 np0005591762 systemd-logind[744]: New session 7 of user zuul.
Jan 22 04:14:21 np0005591762 systemd[1]: Started Session 7 of User zuul.
Jan 22 04:14:21 np0005591762 python3[30050]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:14:23 np0005591762 python3[30162]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:23 np0005591762 python3[30235]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=delorean.repo follow=False checksum=1d7412093fdea43b5454099227a576288791d9ce backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:23 np0005591762 python3[30261]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:23 np0005591762 python3[30334]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=50a3fd92f8bf68f65d4644f7ea4a784e3eaa0ad5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:24 np0005591762 python3[30360]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:24 np0005591762 python3[30433]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=8163d09913b97597f86e38eb45c3003e91da783e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:24 np0005591762 python3[30459]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:24 np0005591762 python3[30532]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=d108d0750ad5b288ccc41bc6534ea307cc51e987 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:24 np0005591762 python3[30558]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:25 np0005591762 python3[30631]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=20c3917c672c059a872cf09a437f61890d2f89fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:25 np0005591762 python3[30657]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:25 np0005591762 python3[30730]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=4d14f168e8a0e6930d905faffbcdf4fedd6664d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:25 np0005591762 python3[30756]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:14:26 np0005591762 python3[30829]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769073262.9258723-34405-116900834947032/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:14:35 np0005591762 python3[30877]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:16:00 np0005591762 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 22 04:16:00 np0005591762 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 22 04:16:00 np0005591762 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 22 04:16:00 np0005591762 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 22 04:19:35 np0005591762 systemd[1]: session-7.scope: Deactivated successfully.
Jan 22 04:19:35 np0005591762 systemd[1]: session-7.scope: Consumed 3.651s CPU time.
Jan 22 04:19:35 np0005591762 systemd-logind[744]: Session 7 logged out. Waiting for processes to exit.
Jan 22 04:19:35 np0005591762 systemd-logind[744]: Removed session 7.
Jan 22 04:24:59 np0005591762 systemd-logind[744]: New session 8 of user zuul.
Jan 22 04:24:59 np0005591762 systemd[1]: Started Session 8 of User zuul.
Jan 22 04:25:00 np0005591762 python3.9[31038]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:25:01 np0005591762 python3.9[31219]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:25:09 np0005591762 systemd[1]: session-8.scope: Deactivated successfully.
Jan 22 04:25:09 np0005591762 systemd[1]: session-8.scope: Consumed 6.131s CPU time.
Jan 22 04:25:09 np0005591762 systemd-logind[744]: Session 8 logged out. Waiting for processes to exit.
Jan 22 04:25:09 np0005591762 systemd-logind[744]: Removed session 8.
Jan 22 04:25:24 np0005591762 systemd-logind[744]: New session 9 of user zuul.
Jan 22 04:25:24 np0005591762 systemd[1]: Started Session 9 of User zuul.
Jan 22 04:25:25 np0005591762 python3.9[31429]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 22 04:25:26 np0005591762 python3.9[31603]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:25:27 np0005591762 python3.9[31755]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:25:27 np0005591762 python3.9[31908]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:25:28 np0005591762 python3.9[32060]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:25:29 np0005591762 python3.9[32212]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:25:29 np0005591762 python3.9[32335]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769073928.6857715-174-157552698868854/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:25:30 np0005591762 python3.9[32487]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:25:30 np0005591762 python3.9[32643]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:25:31 np0005591762 python3.9[32795]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:25:31 np0005591762 python3.9[32945]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:25:33 np0005591762 python3.9[33198]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:25:34 np0005591762 python3.9[33348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:25:35 np0005591762 python3.9[33502]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:25:36 np0005591762 python3.9[33660]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:25:36 np0005591762 python3.9[33744]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:26:47 np0005591762 systemd[1]: Starting dnf makecache...
Jan 22 04:26:47 np0005591762 systemd[1]: Reloading.
Jan 22 04:26:47 np0005591762 dnf[33915]: Failed determining last makecache time.
Jan 22 04:26:47 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:26:47 np0005591762 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 22 04:26:47 np0005591762 dnf[33915]: delorean-openstack-barbican-42b4c41831408a8e323  21 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 systemd[1]: Reloading.
Jan 22 04:26:48 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-openstack-cinder-1c00d6490d88e436f26ef  23 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 22 04:26:48 np0005591762 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-python-stevedore-c4acc5639fd2329372142  21 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 systemd[1]: Reloading.
Jan 22 04:26:48 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-python-cloudkitty-tests-tempest-2c80f8  21 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 22 04:26:48 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:26:48 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:26:48 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-os-refresh-config-9bfc52b5049be2d8de61  23 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:48 np0005591762 dnf[33915]: delorean-python-designate-tests-tempest-347fdbc  23 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-openstack-glance-1fd12c29b339f30fe823e  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-openstack-keystone-e4b40af0ae3698fbbbb  23 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-openstack-manila-3c01b7181572c95dac462  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-python-whitebox-neutron-tests-tempest-  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-openstack-octavia-ba397f07a7331190208c  21 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-openstack-watcher-c014f81a8647287f6dcc  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:49 np0005591762 dnf[33915]: delorean-ansible-config_template-5ccaa22121a7ff  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:50 np0005591762 dnf[33915]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158  21 kB/s | 3.0 kB     00:00
Jan 22 04:26:50 np0005591762 dnf[33915]: delorean-openstack-swift-dc98a8463506ac520c469a  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:50 np0005591762 dnf[33915]: delorean-python-tempestconf-8515371b7cceebd4282  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:50 np0005591762 dnf[33915]: delorean-openstack-heat-ui-013accbfd179753bc3f0  22 kB/s | 3.0 kB     00:00
Jan 22 04:26:51 np0005591762 dnf[33915]: CentOS Stream 9 - BaseOS                         12 kB/s | 6.7 kB     00:00
Jan 22 04:26:51 np0005591762 dnf[33915]: CentOS Stream 9 - AppStream                      16 kB/s | 6.8 kB     00:00
Jan 22 04:26:52 np0005591762 dnf[33915]: CentOS Stream 9 - CRB                            14 kB/s | 6.6 kB     00:00
Jan 22 04:26:52 np0005591762 dnf[33915]: CentOS Stream 9 - Extras packages                12 kB/s | 7.3 kB     00:00
Jan 22 04:26:52 np0005591762 dnf[33915]: dlrn-antelope-testing                            22 kB/s | 3.0 kB     00:00
Jan 22 04:26:52 np0005591762 dnf[33915]: dlrn-antelope-build-deps                         22 kB/s | 3.0 kB     00:00
Jan 22 04:26:55 np0005591762 dnf[33915]: centos9-rabbitmq                                1.1 kB/s | 3.0 kB     00:02
Jan 22 04:26:57 np0005591762 dnf[33915]: centos9-storage                                 2.1 kB/s | 3.0 kB     00:01
Jan 22 04:26:57 np0005591762 dnf[33915]: centos9-opstools                                7.0 kB/s | 3.0 kB     00:00
Jan 22 04:26:57 np0005591762 dnf[33915]: NFV SIG OpenvSwitch                             7.0 kB/s | 3.0 kB     00:00
Jan 22 04:26:58 np0005591762 dnf[33915]: repo-setup-centos-appstream                      10 kB/s | 4.4 kB     00:00
Jan 22 04:26:59 np0005591762 dnf[33915]: repo-setup-centos-baseos                        3.2 kB/s | 3.9 kB     00:01
Jan 22 04:27:01 np0005591762 dnf[33915]: repo-setup-centos-highavailability              2.7 kB/s | 3.9 kB     00:01
Jan 22 04:27:01 np0005591762 dnf[33915]: repo-setup-centos-powertools                     10 kB/s | 4.3 kB     00:00
Jan 22 04:27:04 np0005591762 dnf[33915]: Extra Packages for Enterprise Linux 9 - x86_64   10 kB/s |  25 kB     00:02
Jan 22 04:27:04 np0005591762 dnf[33915]: Metadata cache created.
Jan 22 04:27:04 np0005591762 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 04:27:04 np0005591762 systemd[1]: Finished dnf makecache.
Jan 22 04:27:04 np0005591762 systemd[1]: dnf-makecache.service: Consumed 1.344s CPU time.
Jan 22 04:27:32 np0005591762 kernel: SELinux:  Converting 2724 SID table entries...
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:27:32 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:27:32 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 22 04:27:32 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:27:32 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:27:32 np0005591762 systemd[1]: Reloading.
Jan 22 04:27:32 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:27:33 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:27:33 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:27:33 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:27:33 np0005591762 systemd[1]: run-rf8c99102bafb4bb088de2073ff8b6207.service: Deactivated successfully.
Jan 22 04:27:35 np0005591762 python3.9[35287]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:27:37 np0005591762 python3.9[35568]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 22 04:27:38 np0005591762 python3.9[35720]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 22 04:27:40 np0005591762 python3.9[35873]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:27:40 np0005591762 python3.9[36025]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 22 04:27:42 np0005591762 python3.9[36177]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:27:42 np0005591762 python3.9[36329]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:27:42 np0005591762 python3.9[36452]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074062.221014-664-103738642660671/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:27:45 np0005591762 python3.9[36604]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:27:46 np0005591762 python3.9[36756]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:27:47 np0005591762 python3.9[36909]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:27:48 np0005591762 python3.9[37062]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 22 04:27:51 np0005591762 python3.9[37215]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 04:27:51 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:27:51 np0005591762 python3.9[37374]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 04:27:52 np0005591762 python3.9[37534]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 22 04:27:52 np0005591762 python3.9[37687]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 04:27:53 np0005591762 python3.9[37845]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 22 04:27:54 np0005591762 python3.9[37997]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:27:55 np0005591762 python3.9[38150]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:27:56 np0005591762 python3.9[38302]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:27:56 np0005591762 python3.9[38425]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074076.0973873-1021-126712174892418/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:27:57 np0005591762 python3.9[38577]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:27:57 np0005591762 systemd[1]: Starting Load Kernel Modules...
Jan 22 04:27:57 np0005591762 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 22 04:27:57 np0005591762 systemd-modules-load[38581]: Inserted module 'br_netfilter'
Jan 22 04:27:57 np0005591762 kernel: Bridge firewalling registered
Jan 22 04:27:57 np0005591762 systemd[1]: Finished Load Kernel Modules.
Jan 22 04:27:58 np0005591762 python3.9[38736]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:27:58 np0005591762 python3.9[38859]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074077.872785-1090-221928762784347/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:27:59 np0005591762 python3.9[39011]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:28:05 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:28:05 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:28:05 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:28:05 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:28:05 np0005591762 systemd[1]: Reloading.
Jan 22 04:28:05 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:28:05 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:28:06 np0005591762 python3.9[40646]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:28:07 np0005591762 python3.9[41633]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 22 04:28:07 np0005591762 python3.9[42484]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:28:07 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:28:07 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:28:07 np0005591762 systemd[1]: man-db-cache-update.service: Consumed 3.138s CPU time.
Jan 22 04:28:07 np0005591762 systemd[1]: run-ra20131cfebf24812b75634a6c7956d4a.service: Deactivated successfully.
Jan 22 04:28:08 np0005591762 python3.9[43181]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:28:08 np0005591762 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 04:28:08 np0005591762 systemd[1]: Starting Authorization Manager...
Jan 22 04:28:08 np0005591762 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 04:28:08 np0005591762 polkitd[43398]: Started polkitd version 0.117
Jan 22 04:28:08 np0005591762 systemd[1]: Started Authorization Manager.
Jan 22 04:28:09 np0005591762 python3.9[43564]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:28:09 np0005591762 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 22 04:28:09 np0005591762 systemd[1]: tuned.service: Deactivated successfully.
Jan 22 04:28:09 np0005591762 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 22 04:28:09 np0005591762 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 04:28:09 np0005591762 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 04:28:10 np0005591762 python3.9[43726]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 22 04:28:12 np0005591762 python3.9[43878]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:28:12 np0005591762 systemd[1]: Reloading.
Jan 22 04:28:12 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:28:13 np0005591762 python3.9[44066]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:28:13 np0005591762 systemd[1]: Reloading.
Jan 22 04:28:13 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:28:14 np0005591762 python3.9[44255]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:28:14 np0005591762 python3.9[44408]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:28:14 np0005591762 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 22 04:28:15 np0005591762 python3.9[44561]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:28:16 np0005591762 python3.9[44723]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:28:17 np0005591762 python3.9[44876]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:28:17 np0005591762 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 04:28:17 np0005591762 systemd[1]: Stopped Apply Kernel Variables.
Jan 22 04:28:17 np0005591762 systemd[1]: Stopping Apply Kernel Variables...
Jan 22 04:28:17 np0005591762 systemd[1]: Starting Apply Kernel Variables...
Jan 22 04:28:17 np0005591762 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 04:28:17 np0005591762 systemd[1]: Finished Apply Kernel Variables.
Jan 22 04:28:17 np0005591762 systemd[1]: session-9.scope: Deactivated successfully.
Jan 22 04:28:17 np0005591762 systemd[1]: session-9.scope: Consumed 1min 39.418s CPU time.
Jan 22 04:28:17 np0005591762 systemd-logind[744]: Session 9 logged out. Waiting for processes to exit.
Jan 22 04:28:17 np0005591762 systemd-logind[744]: Removed session 9.
Jan 22 04:28:22 np0005591762 systemd-logind[744]: New session 10 of user zuul.
Jan 22 04:28:22 np0005591762 systemd[1]: Started Session 10 of User zuul.
Jan 22 04:28:23 np0005591762 python3.9[45060]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:28:24 np0005591762 python3.9[45216]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 22 04:28:25 np0005591762 python3.9[45369]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 04:28:25 np0005591762 python3.9[45527]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 04:28:26 np0005591762 python3.9[45687]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:28:27 np0005591762 python3.9[45771]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 04:28:33 np0005591762 python3.9[45935]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:28:41 np0005591762 kernel: SELinux:  Converting 2736 SID table entries...
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:28:41 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:28:41 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 22 04:28:41 np0005591762 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 22 04:28:42 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:28:42 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:28:42 np0005591762 systemd[1]: Reloading.
Jan 22 04:28:42 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:28:42 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:28:42 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:28:43 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:28:43 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:28:43 np0005591762 systemd[1]: run-rb0d8a88ffa0c46cd9be7308f830a430d.service: Deactivated successfully.
Jan 22 04:28:45 np0005591762 python3.9[47032]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:28:45 np0005591762 systemd[1]: Reloading.
Jan 22 04:28:45 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:28:45 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:28:45 np0005591762 systemd[1]: Starting Open vSwitch Database Unit...
Jan 22 04:28:45 np0005591762 chown[47074]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 22 04:28:45 np0005591762 ovs-ctl[47079]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 22 04:28:45 np0005591762 ovs-ctl[47079]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 22 04:28:45 np0005591762 ovs-ctl[47079]: Starting ovsdb-server [  OK  ]
Jan 22 04:28:45 np0005591762 ovs-vsctl[47128]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 22 04:28:45 np0005591762 ovs-vsctl[47148]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"61e0485d-79f8-4954-8f50-00743b2f8934\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 22 04:28:45 np0005591762 ovs-ctl[47079]: Configuring Open vSwitch system IDs [  OK  ]
Jan 22 04:28:45 np0005591762 ovs-ctl[47079]: Enabling remote OVSDB managers [  OK  ]
Jan 22 04:28:45 np0005591762 systemd[1]: Started Open vSwitch Database Unit.
Jan 22 04:28:45 np0005591762 ovs-vsctl[47154]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 22 04:28:45 np0005591762 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 22 04:28:45 np0005591762 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 22 04:28:45 np0005591762 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 22 04:28:45 np0005591762 kernel: openvswitch: Open vSwitch switching datapath
Jan 22 04:28:45 np0005591762 ovs-ctl[47198]: Inserting openvswitch module [  OK  ]
Jan 22 04:28:45 np0005591762 ovs-ctl[47167]: Starting ovs-vswitchd [  OK  ]
Jan 22 04:28:45 np0005591762 ovs-ctl[47167]: Enabling remote OVSDB managers [  OK  ]
Jan 22 04:28:45 np0005591762 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 22 04:28:45 np0005591762 ovs-vsctl[47220]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 22 04:28:45 np0005591762 systemd[1]: Starting Open vSwitch...
Jan 22 04:28:45 np0005591762 systemd[1]: Finished Open vSwitch.
Jan 22 04:28:46 np0005591762 python3.9[47370]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:28:47 np0005591762 python3.9[47522]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 22 04:28:48 np0005591762 kernel: SELinux:  Converting 2750 SID table entries...
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:28:48 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:28:48 np0005591762 python3.9[47677]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:28:49 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 22 04:28:49 np0005591762 python3.9[47835]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:28:51 np0005591762 python3.9[47988]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:28:52 np0005591762 python3.9[48275]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 22 04:28:53 np0005591762 python3.9[48425]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:28:53 np0005591762 python3.9[48579]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:28:56 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:28:56 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:28:56 np0005591762 systemd[1]: Reloading.
Jan 22 04:28:56 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:28:56 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:28:56 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:28:56 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:28:56 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:28:56 np0005591762 systemd[1]: run-rf245921a29ea44088c4987400e4d0454.service: Deactivated successfully.
Jan 22 04:28:58 np0005591762 python3.9[48897]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:28:58 np0005591762 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 04:28:58 np0005591762 systemd[1]: Stopped Network Manager Wait Online.
Jan 22 04:28:58 np0005591762 systemd[1]: Stopping Network Manager Wait Online...
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0401] caught SIGTERM, shutting down normally.
Jan 22 04:28:58 np0005591762 systemd[1]: Stopping Network Manager...
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0412] dhcp4 (eth0): canceled DHCP transaction
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0412] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0412] dhcp4 (eth0): state changed no lease
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0413] dhcp6 (eth0): canceled DHCP transaction
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0413] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0413] dhcp6 (eth0): state changed no lease
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0415] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 04:28:58 np0005591762 NetworkManager[7277]: <info>  [1769074138.0439] exiting (success)
Jan 22 04:28:58 np0005591762 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 04:28:58 np0005591762 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 04:28:58 np0005591762 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 04:28:58 np0005591762 systemd[1]: Stopped Network Manager.
Jan 22 04:28:58 np0005591762 systemd[1]: Starting Network Manager...
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.0976] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:8b9e49f6-dfde-4886-8f0c-7f0567b85e9e)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.0978] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1018] manager[0x5635495f8000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 04:28:58 np0005591762 systemd[1]: Starting Hostname Service...
Jan 22 04:28:58 np0005591762 systemd[1]: Started Hostname Service.
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1604] hostname: hostname: using hostnamed
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1604] hostname: static hostname changed from (none) to "compute-2"
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1607] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1610] manager[0x5635495f8000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1610] manager[0x5635495f8000]: rfkill: WWAN hardware radio set enabled
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1625] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1631] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1632] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1632] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1633] manager: Networking is enabled by state file
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1635] settings: Loaded settings plugin: keyfile (internal)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1637] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1656] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1662] dhcp: init: Using DHCP client 'internal'
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1665] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1668] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1672] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1677] device (lo): Activation: starting connection 'lo' (8f4da813-534e-4822-a9a0-9dc45c872492)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1682] device (eth0): carrier: link connected
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1685] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1688] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1689] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1692] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1697] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1701] device (eth1): carrier: link connected
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1705] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1708] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (7f74ce3a-a81b-54b7-b052-8987ea8817f8) (indicated)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1709] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1712] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1717] device (eth1): Activation: starting connection 'ci-private-network' (7f74ce3a-a81b-54b7-b052-8987ea8817f8)
Jan 22 04:28:58 np0005591762 systemd[1]: Started Network Manager.
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1747] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1751] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1753] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1755] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1756] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1758] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1760] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1761] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1764] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1769] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1771] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1773] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1778] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1781] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1792] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1795] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1797] dhcp4 (eth0): state changed new lease, address=192.168.26.49
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1799] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1802] device (lo): Activation: successful, device activated.
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1809] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1830] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1832] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1835] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 22 04:28:58 np0005591762 NetworkManager[48910]: <info>  [1769074138.1836] device (eth1): Activation: successful, device activated.
Jan 22 04:28:58 np0005591762 systemd[1]: Starting Network Manager Wait Online...
Jan 22 04:28:58 np0005591762 python3.9[49106]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2464] dhcp6 (eth0): state changed new lease, address=2001:db8::1cd
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2472] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2501] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2502] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2505] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2506] device (eth0): Activation: successful, device activated.
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2510] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 04:28:59 np0005591762 NetworkManager[48910]: <info>  [1769074139.2511] manager: startup complete
Jan 22 04:28:59 np0005591762 systemd[1]: Finished Network Manager Wait Online.
Jan 22 04:29:05 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:29:05 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:29:05 np0005591762 systemd[1]: Reloading.
Jan 22 04:29:05 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:29:05 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:29:05 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:29:05 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:29:05 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:29:05 np0005591762 systemd[1]: run-r9b3e43d91665415098b7b7c78ded9fd5.service: Deactivated successfully.
Jan 22 04:29:09 np0005591762 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 04:29:10 np0005591762 python3.9[49586]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:29:10 np0005591762 python3.9[49738]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:11 np0005591762 python3.9[49892]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:11 np0005591762 python3.9[50044]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:12 np0005591762 python3.9[50198]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:12 np0005591762 python3.9[50350]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:13 np0005591762 python3.9[50502]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:29:13 np0005591762 python3.9[50625]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074153.125295-644-97847800400841/.source _original_basename=.n8zx9df4 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:14 np0005591762 python3.9[50777]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:15 np0005591762 python3.9[50929]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 22 04:29:15 np0005591762 python3.9[51081]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:17 np0005591762 python3.9[51508]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 22 04:29:18 np0005591762 ansible-async_wrapper.py[51683]: Invoked with j143823060016 300 /home/zuul/.ansible/tmp/ansible-tmp-1769074157.639033-842-90501019777277/AnsiballZ_edpm_os_net_config.py _
Jan 22 04:29:18 np0005591762 ansible-async_wrapper.py[51686]: Starting module and watcher
Jan 22 04:29:18 np0005591762 ansible-async_wrapper.py[51686]: Start watching 51687 (300)
Jan 22 04:29:18 np0005591762 ansible-async_wrapper.py[51687]: Start module (51687)
Jan 22 04:29:18 np0005591762 ansible-async_wrapper.py[51683]: Return async_wrapper task started.
Jan 22 04:29:18 np0005591762 python3.9[51688]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 22 04:29:18 np0005591762 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 22 04:29:18 np0005591762 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 22 04:29:18 np0005591762 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 22 04:29:18 np0005591762 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 22 04:29:18 np0005591762 kernel: cfg80211: failed to load regulatory.db
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.6875] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.6888] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7230] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7232] audit: op="connection-add" uuid="cf3bdcbc-21cf-41ab-bacb-326b964a98ba" name="br-ex-br" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7242] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7243] audit: op="connection-add" uuid="76f488d8-9c18-4565-83a9-f27767e4a3c7" name="br-ex-port" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7251] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7253] audit: op="connection-add" uuid="ff3f1c99-cfdd-4071-be39-2411aa7a76e9" name="eth1-port" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7261] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7262] audit: op="connection-add" uuid="7c76e99a-074e-439a-95c0-3fac3025a6f6" name="vlan20-port" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7270] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7271] audit: op="connection-add" uuid="8ac60fda-2373-450e-8222-eadb3418c552" name="vlan21-port" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7279] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7281] audit: op="connection-add" uuid="994192c0-a3aa-432d-b263-86b834349b68" name="vlan22-port" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7288] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7290] audit: op="connection-add" uuid="f9cb2e18-aa70-458e-8a96-e87cbce1f970" name="vlan23-port" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7304] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.may-fail,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.routes,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7315] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7317] audit: op="connection-add" uuid="c54f729a-9b0d-4d70-ad10-c1cd79fe5d84" name="br-ex-if" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7333] audit: op="connection-update" uuid="7f74ce3a-a81b-54b7-b052-8987ea8817f8" name="ci-private-network" args="ipv6.method,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.addresses,ipv6.dns,ipv6.routes,ovs-interface.type,ipv4.method,ipv4.routing-rules,ipv4.addresses,ipv4.dns,ipv4.never-default,ipv4.routes,connection.controller,connection.master,connection.slave-type,connection.timestamp,connection.port-type,ovs-external-ids.data" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7345] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7347] audit: op="connection-add" uuid="8f067023-8ff6-465d-8dfd-2f8a9c925cf6" name="vlan20-if" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7357] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7359] audit: op="connection-add" uuid="0eabd08f-4587-4e5f-bcde-525b4c3d274c" name="vlan21-if" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7369] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7371] audit: op="connection-add" uuid="bd52e410-88e3-4bf0-90f4-efed0f14d19a" name="vlan22-if" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7382] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7383] audit: op="connection-add" uuid="7823f680-d3b0-4975-b76b-61c66ad7e1d1" name="vlan23-if" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7391] audit: op="connection-delete" uuid="795c5902-e4d2-3a00-9d45-2dfdafd10a3d" name="Wired connection 1" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7399] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7401] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7406] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7409] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (cf3bdcbc-21cf-41ab-bacb-326b964a98ba)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7410] audit: op="connection-activate" uuid="cf3bdcbc-21cf-41ab-bacb-326b964a98ba" name="br-ex-br" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7411] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7412] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7416] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7419] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (76f488d8-9c18-4565-83a9-f27767e4a3c7)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7421] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7422] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7425] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7428] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (ff3f1c99-cfdd-4071-be39-2411aa7a76e9)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7429] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7430] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7434] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7437] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (7c76e99a-074e-439a-95c0-3fac3025a6f6)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7439] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7440] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7443] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7446] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (8ac60fda-2373-450e-8222-eadb3418c552)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7448] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7449] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7452] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7455] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (994192c0-a3aa-432d-b263-86b834349b68)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7457] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7458] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7461] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7465] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f9cb2e18-aa70-458e-8a96-e87cbce1f970)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7465] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7467] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7469] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7473] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7475] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7477] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7480] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c54f729a-9b0d-4d70-ad10-c1cd79fe5d84)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7481] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7483] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7485] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7486] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7487] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7493] device (eth1): disconnecting for new activation request.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7494] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7496] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7498] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7499] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7501] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7502] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7505] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7508] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8f067023-8ff6-465d-8dfd-2f8a9c925cf6)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7509] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7511] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7512] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7514] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7516] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7517] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7519] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7522] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (0eabd08f-4587-4e5f-bcde-525b4c3d274c)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7523] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7525] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7527] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7528] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7530] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7531] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7533] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7537] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (bd52e410-88e3-4bf0-90f4-efed0f14d19a)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7537] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7540] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7541] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7542] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7545] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <warn>  [1769074159.7546] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7548] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7551] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (7823f680-d3b0-4975-b76b-61c66ad7e1d1)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7552] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7554] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7555] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7557] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7558] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7567] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.may-fail,ipv6.addr-gen-mode,ipv6.routes,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,802-3-ethernet.mtu" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7568] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7570] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7572] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7577] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7579] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7582] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7584] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7586] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 kernel: ovs-system: entered promiscuous mode
Jan 22 04:29:19 np0005591762 kernel: Timeout policy base is empty
Jan 22 04:29:19 np0005591762 systemd-udevd[51694]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7611] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7613] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7615] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7616] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7619] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7621] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7623] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7624] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7626] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7631] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7633] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7634] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7637] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7641] dhcp4 (eth0): canceled DHCP transaction
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7642] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7643] dhcp4 (eth0): state changed no lease
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7643] dhcp6 (eth0): canceled DHCP transaction
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7644] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7644] dhcp6 (eth0): state changed no lease
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7648] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7706] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7710] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51689 uid=0 result="fail" reason="Device is not activated"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7730] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7734] dhcp4 (eth0): state changed new lease, address=192.168.26.49
Jan 22 04:29:19 np0005591762 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7763] device (eth1): disconnecting for new activation request.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7764] audit: op="connection-activate" uuid="7f74ce3a-a81b-54b7-b052-8987ea8817f8" name="ci-private-network" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7765] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7773] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7779] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7792] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 22 04:29:19 np0005591762 kernel: br-ex: entered promiscuous mode
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7868] device (eth1): Activation: starting connection 'ci-private-network' (7f74ce3a-a81b-54b7-b052-8987ea8817f8)
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7870] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7871] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51689 uid=0 result="success"
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7876] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7877] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7894] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7897] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7903] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7904] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7905] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7906] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7907] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7908] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7910] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7919] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7922] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7926] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7929] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7932] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7935] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7944] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7947] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 kernel: vlan22: entered promiscuous mode
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7956] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7959] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7962] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7966] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7977] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7980] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.7986] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8009] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8019] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 kernel: vlan23: entered promiscuous mode
Jan 22 04:29:19 np0005591762 systemd-udevd[51695]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8058] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8062] device (eth1): Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8079] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8080] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 kernel: vlan20: entered promiscuous mode
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8092] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8096] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8111] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 22 04:29:19 np0005591762 kernel: vlan21: entered promiscuous mode
Jan 22 04:29:19 np0005591762 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8172] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8186] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8256] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8257] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8265] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8270] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8274] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8277] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8281] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8284] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8306] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8310] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8330] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8331] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8334] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8337] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8342] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 04:29:19 np0005591762 NetworkManager[48910]: <info>  [1769074159.8346] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 04:29:20 np0005591762 NetworkManager[48910]: <info>  [1769074160.9428] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.0572] checkpoint[0x5635495ce950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.0574] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.1807] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.1816] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.3631] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.4664] checkpoint[0x5635495cea20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.4669] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.7033] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.7043] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.8730] audit: op="networking-control" arg="global-dns-configuration" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.8742] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf)
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.8747] audit: op="networking-control" arg="global-dns-configuration" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.8778] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=51689 uid=0 result="success"
Jan 22 04:29:21 np0005591762 python3.9[52044]: ansible-ansible.legacy.async_status Invoked with jid=j143823060016.51683 mode=status _async_dir=/root/.ansible_async
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.9950] checkpoint[0x5635495ceaf0]: destroy /org/freedesktop/NetworkManager/Checkpoint/3
Jan 22 04:29:21 np0005591762 NetworkManager[48910]: <info>  [1769074161.9953] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=51689 uid=0 result="success"
Jan 22 04:29:22 np0005591762 ansible-async_wrapper.py[51687]: Module complete (51687)
Jan 22 04:29:23 np0005591762 ansible-async_wrapper.py[51686]: Done in kid B.
Jan 22 04:29:25 np0005591762 python3.9[52148]: ansible-ansible.legacy.async_status Invoked with jid=j143823060016.51683 mode=status _async_dir=/root/.ansible_async
Jan 22 04:29:25 np0005591762 python3.9[52248]: ansible-ansible.legacy.async_status Invoked with jid=j143823060016.51683 mode=cleanup _async_dir=/root/.ansible_async
Jan 22 04:29:26 np0005591762 python3.9[52400]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:29:26 np0005591762 python3.9[52523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074165.8515992-923-221130424759732/.source.returncode _original_basename=.lv_mqz0p follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:27 np0005591762 python3.9[52675]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:29:27 np0005591762 python3.9[52798]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074166.8333461-971-191832519537980/.source.cfg _original_basename=.5ied5oco follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:28 np0005591762 python3.9[52950]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:29:28 np0005591762 systemd[1]: Reloading Network Manager...
Jan 22 04:29:28 np0005591762 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 04:29:28 np0005591762 NetworkManager[48910]: <info>  [1769074168.2020] audit: op="reload" arg="0" pid=52954 uid=0 result="success"
Jan 22 04:29:28 np0005591762 NetworkManager[48910]: <info>  [1769074168.2025] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 22 04:29:28 np0005591762 NetworkManager[48910]: <info>  [1769074168.2026] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 04:29:28 np0005591762 systemd[1]: Reloaded Network Manager.
Jan 22 04:29:28 np0005591762 systemd[1]: session-10.scope: Deactivated successfully.
Jan 22 04:29:28 np0005591762 systemd[1]: session-10.scope: Consumed 34.974s CPU time.
Jan 22 04:29:28 np0005591762 systemd-logind[744]: Session 10 logged out. Waiting for processes to exit.
Jan 22 04:29:28 np0005591762 systemd-logind[744]: Removed session 10.
Jan 22 04:29:33 np0005591762 systemd-logind[744]: New session 11 of user zuul.
Jan 22 04:29:33 np0005591762 systemd[1]: Started Session 11 of User zuul.
Jan 22 04:29:34 np0005591762 python3.9[53140]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:29:34 np0005591762 python3.9[53294]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:29:35 np0005591762 python3.9[53488]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:29:36 np0005591762 systemd[1]: session-11.scope: Deactivated successfully.
Jan 22 04:29:36 np0005591762 systemd[1]: session-11.scope: Consumed 1.686s CPU time.
Jan 22 04:29:36 np0005591762 systemd-logind[744]: Session 11 logged out. Waiting for processes to exit.
Jan 22 04:29:36 np0005591762 systemd-logind[744]: Removed session 11.
Jan 22 04:29:38 np0005591762 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 04:29:41 np0005591762 systemd-logind[744]: New session 12 of user zuul.
Jan 22 04:29:41 np0005591762 systemd[1]: Started Session 12 of User zuul.
Jan 22 04:29:42 np0005591762 python3.9[53670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:29:42 np0005591762 python3.9[53824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:29:43 np0005591762 python3.9[53980]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:29:44 np0005591762 python3.9[54064]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:29:45 np0005591762 python3.9[54218]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:29:46 np0005591762 python3.9[54413]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:47 np0005591762 python3.9[54565]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:29:47 np0005591762 systemd[1]: var-lib-containers-storage-overlay-compat2389784867-merged.mount: Deactivated successfully.
Jan 22 04:29:47 np0005591762 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1842741692-merged.mount: Deactivated successfully.
Jan 22 04:29:47 np0005591762 podman[54566]: 2026-01-22 09:29:47.17175372 +0000 UTC m=+0.044074033 system refresh
Jan 22 04:29:47 np0005591762 python3.9[54726]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:29:48 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:29:48 np0005591762 python3.9[54850]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074187.3230836-194-225405761479647/.source.json follow=False _original_basename=podman_network_config.j2 checksum=de481d27b0152dea66347f907ce80f2fe6035007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:29:48 np0005591762 python3.9[55002]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:29:49 np0005591762 python3.9[55125]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074188.4019318-239-179194456460401/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:29:49 np0005591762 python3.9[55277]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:29:50 np0005591762 python3.9[55429]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:29:50 np0005591762 python3.9[55581]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:29:51 np0005591762 python3.9[55734]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:29:51 np0005591762 python3.9[55886]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:29:53 np0005591762 python3.9[56039]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:29:54 np0005591762 python3.9[56193]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:29:54 np0005591762 python3.9[56345]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:29:55 np0005591762 python3.9[56497]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:29:56 np0005591762 python3.9[56650]: ansible-service_facts Invoked
Jan 22 04:29:56 np0005591762 network[56667]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:29:56 np0005591762 network[56668]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:29:56 np0005591762 network[56669]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:29:59 np0005591762 python3.9[57121]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:30:01 np0005591762 python3.9[57274]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 22 04:30:02 np0005591762 python3.9[57426]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:03 np0005591762 python3.9[57551]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074202.3670495-671-44692183249964/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:03 np0005591762 python3.9[57705]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:04 np0005591762 python3.9[57830]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074203.3280888-717-112548553323436/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:05 np0005591762 python3.9[57984]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:06 np0005591762 python3.9[58138]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:30:07 np0005591762 python3.9[58222]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:30:08 np0005591762 python3.9[58376]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:30:08 np0005591762 python3.9[58460]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:30:08 np0005591762 chronyd[753]: chronyd exiting
Jan 22 04:30:08 np0005591762 systemd[1]: Stopping NTP client/server...
Jan 22 04:30:08 np0005591762 systemd[1]: chronyd.service: Deactivated successfully.
Jan 22 04:30:08 np0005591762 systemd[1]: Stopped NTP client/server.
Jan 22 04:30:08 np0005591762 systemd[1]: Starting NTP client/server...
Jan 22 04:30:09 np0005591762 chronyd[58468]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 04:30:09 np0005591762 chronyd[58468]: Frequency -9.497 +/- 0.307 ppm read from /var/lib/chrony/drift
Jan 22 04:30:09 np0005591762 chronyd[58468]: Loaded seccomp filter (level 2)
Jan 22 04:30:09 np0005591762 systemd[1]: Started NTP client/server.
Jan 22 04:30:09 np0005591762 systemd[1]: session-12.scope: Deactivated successfully.
Jan 22 04:30:09 np0005591762 systemd[1]: session-12.scope: Consumed 17.682s CPU time.
Jan 22 04:30:09 np0005591762 systemd-logind[744]: Session 12 logged out. Waiting for processes to exit.
Jan 22 04:30:09 np0005591762 systemd-logind[744]: Removed session 12.
Jan 22 04:30:14 np0005591762 systemd-logind[744]: New session 13 of user zuul.
Jan 22 04:30:14 np0005591762 systemd[1]: Started Session 13 of User zuul.
Jan 22 04:30:15 np0005591762 python3.9[58649]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:15 np0005591762 python3.9[58801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:16 np0005591762 python3.9[58924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074215.433477-60-202774435401889/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:16 np0005591762 systemd[1]: session-13.scope: Deactivated successfully.
Jan 22 04:30:16 np0005591762 systemd[1]: session-13.scope: Consumed 1.112s CPU time.
Jan 22 04:30:16 np0005591762 systemd-logind[744]: Session 13 logged out. Waiting for processes to exit.
Jan 22 04:30:16 np0005591762 systemd-logind[744]: Removed session 13.
Jan 22 04:30:21 np0005591762 systemd-logind[744]: New session 14 of user zuul.
Jan 22 04:30:21 np0005591762 systemd[1]: Started Session 14 of User zuul.
Jan 22 04:30:22 np0005591762 python3.9[59102]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:30:23 np0005591762 python3.9[59258]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:23 np0005591762 python3.9[59433]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:24 np0005591762 python3.9[59556]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769074223.4284701-80-222205503102583/.source.json _original_basename=._yrtrywu follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:25 np0005591762 python3.9[59708]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:25 np0005591762 python3.9[59831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074224.7579165-149-184113444567727/.source _original_basename=.8baxmela follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:25 np0005591762 python3.9[59983]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:30:26 np0005591762 python3.9[60135]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:26 np0005591762 python3.9[60258]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074226.0971973-221-77241219054957/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:30:27 np0005591762 python3.9[60410]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:27 np0005591762 python3.9[60533]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074226.868149-221-80580658023269/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:30:28 np0005591762 python3.9[60685]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:28 np0005591762 python3.9[60837]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:28 np0005591762 python3.9[60960]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074228.176681-332-234505302661324/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:29 np0005591762 python3.9[61112]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:29 np0005591762 python3.9[61235]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074229.011271-377-18480080287797/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:30 np0005591762 python3.9[61387]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:30:30 np0005591762 systemd[1]: Reloading.
Jan 22 04:30:30 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:30:30 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:30:30 np0005591762 systemd[1]: Reloading.
Jan 22 04:30:30 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:30:30 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:30:30 np0005591762 systemd[1]: Starting EDPM Container Shutdown...
Jan 22 04:30:30 np0005591762 systemd[1]: Finished EDPM Container Shutdown.
Jan 22 04:30:31 np0005591762 python3.9[61614]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:31 np0005591762 python3.9[61737]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074230.9889534-446-241298759004978/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:32 np0005591762 python3.9[61889]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:32 np0005591762 python3.9[62012]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074231.9667056-491-162283057488830/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:33 np0005591762 python3.9[62164]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:30:33 np0005591762 systemd[1]: Reloading.
Jan 22 04:30:33 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:30:33 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:30:33 np0005591762 systemd[1]: Reloading.
Jan 22 04:30:33 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:30:33 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:30:33 np0005591762 systemd[1]: Starting Create netns directory...
Jan 22 04:30:33 np0005591762 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 04:30:33 np0005591762 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 04:30:33 np0005591762 systemd[1]: Finished Create netns directory.
Jan 22 04:30:34 np0005591762 python3.9[62390]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:30:34 np0005591762 network[62407]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:30:34 np0005591762 network[62408]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:30:34 np0005591762 network[62409]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:30:36 np0005591762 python3.9[62671]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:30:36 np0005591762 systemd[1]: Reloading.
Jan 22 04:30:36 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:30:36 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:30:36 np0005591762 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 22 04:30:36 np0005591762 iptables.init[62711]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 22 04:30:36 np0005591762 iptables.init[62711]: iptables: Flushing firewall rules: [  OK  ]
Jan 22 04:30:36 np0005591762 systemd[1]: iptables.service: Deactivated successfully.
Jan 22 04:30:36 np0005591762 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 22 04:30:37 np0005591762 python3.9[62907]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:30:37 np0005591762 python3.9[63061]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:30:38 np0005591762 systemd[1]: Reloading.
Jan 22 04:30:38 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:30:38 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:30:38 np0005591762 systemd[1]: Starting Netfilter Tables...
Jan 22 04:30:38 np0005591762 systemd[1]: Finished Netfilter Tables.
Jan 22 04:30:38 np0005591762 python3.9[63253]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:30:39 np0005591762 python3.9[63406]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:39 np0005591762 python3.9[63531]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074239.2357376-699-1825711895456/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:40 np0005591762 python3.9[63684]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:30:40 np0005591762 systemd[1]: Reloading OpenSSH server daemon...
Jan 22 04:30:40 np0005591762 systemd[1]: Reloaded OpenSSH server daemon.
Jan 22 04:30:40 np0005591762 python3.9[63840]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:41 np0005591762 python3.9[63992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:41 np0005591762 python3.9[64115]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074241.1424232-792-19067255929741/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:42 np0005591762 python3.9[64267]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 04:30:42 np0005591762 systemd[1]: Starting Time & Date Service...
Jan 22 04:30:42 np0005591762 systemd[1]: Started Time & Date Service.
Jan 22 04:30:43 np0005591762 python3.9[64423]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:43 np0005591762 python3.9[64575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:43 np0005591762 python3.9[64698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074243.3087552-897-251801954776261/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:44 np0005591762 python3.9[64850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:44 np0005591762 python3.9[64973]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074244.1289692-942-226352774496249/.source.yaml _original_basename=.ydctglaw follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:45 np0005591762 python3.9[65125]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:45 np0005591762 python3.9[65248]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074244.9709303-987-94892147717300/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:46 np0005591762 python3.9[65400]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:30:46 np0005591762 python3.9[65553]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:30:47 np0005591762 python3[65706]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 04:30:47 np0005591762 python3.9[65858]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:48 np0005591762 python3.9[65981]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074247.3241374-1104-41225683498764/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:48 np0005591762 python3.9[66133]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:48 np0005591762 python3.9[66256]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074248.1630816-1149-81529740004135/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:49 np0005591762 python3.9[66408]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:49 np0005591762 python3.9[66531]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074249.0176666-1194-171601771650457/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:50 np0005591762 python3.9[66683]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:50 np0005591762 python3.9[66806]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074249.8504267-1239-87842575108232/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:51 np0005591762 python3.9[66958]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:30:51 np0005591762 python3.9[67081]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074250.698605-1284-23396081143916/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:51 np0005591762 python3.9[67233]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:52 np0005591762 python3.9[67385]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:30:53 np0005591762 python3.9[67544]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:53 np0005591762 python3.9[67697]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:53 np0005591762 python3.9[67849]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:30:54 np0005591762 python3.9[68001]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 04:30:55 np0005591762 python3.9[68154]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 04:30:55 np0005591762 systemd-logind[744]: Session 14 logged out. Waiting for processes to exit.
Jan 22 04:30:55 np0005591762 systemd[1]: session-14.scope: Deactivated successfully.
Jan 22 04:30:55 np0005591762 systemd[1]: session-14.scope: Consumed 23.347s CPU time.
Jan 22 04:30:55 np0005591762 systemd-logind[744]: Removed session 14.
Jan 22 04:31:00 np0005591762 systemd-logind[744]: New session 15 of user zuul.
Jan 22 04:31:00 np0005591762 systemd[1]: Started Session 15 of User zuul.
Jan 22 04:31:00 np0005591762 python3.9[68335]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 22 04:31:01 np0005591762 python3.9[68487]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:31:01 np0005591762 python3.9[68639]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:31:02 np0005591762 python3.9[68791]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1l/4Hnab8cJ+0NgZRyND+668QQ18xCAMiTa4tJfwkacqv2+xu0AP833wzvRbj+BSz/GJYAjYZHtl/LPY/fgAiwZLhNui+6RFQXnMI+TWlUgadcYlxCFSLNXdeIU4VHKdxnYN8cw8WtM+PFaCdmFRk0NGTRLladuZ2Ft6qgEk/ocZCZ1hweLpc0NBPMupsV5ABFtNEZPBg5lEqxBdbFOY3MxlYJEKWIsWCyxu9jzoxc8ct4ejcM8FVx9pujC2XCWVumSYrXkp9LnbeYCOlxnalYYTgZWNh3ilMYw3g85DVUyF1ZECfbN4/uuu9emfUiC8EmIRofJTX7/IPDpqM0CgSFHt6gq45OgfrZ+YHcpPg8Bq5JWL3rpkIoZDiidmCCGrtku8huN9VGYcahOdJVixsNrfIS2jx9k86e19gNzUSKc3qxM6HCUrH0yEbXwcOcG6b1EcBllpJsHB3uXZNar6PeI2C+BkUQH/0520RqM7Zb0ZEg4+6S6i+Z11Ddhkn+Sk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEZOEP9uQiV1zH3a3aHqfWGEuJqzUo4rClu3BLMlWitr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNySjQgocwwOdUR7+1+vff+WJ7HHi2x7SZejx49o87M82KSvvvJ1bXTTeQ2yV4jf9DSKuJ6HcIHDr6bnAXEDEj8=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLkHp1/Qvor0RkXO+PvZvnJssDpVN93zM11quNN8iQ4KKQf8UHuKy+z84HXpOkzuxv1FNmR50SFPdR2h52T9/BEP+zzSmYli9cDaisI9zLQpghAnG+lXYjqsiPIXqR2z4IheTXQWRoc0c/9XzYCUMaMD73LVsv2ZTHG2Y7QfvK4MxYDPfGzTPihT0BaumTQQi1aKi5eILvXezyBhIgOrgWXDy73LvUS0A1PnwBTWjez2dmfEl2SozhpeqVRSmWdCZ8dRtXREfB6Mq/AC0SFrdQRYBB1fp6IKFrJhehXq8uN9YGQim7NDv95g1Vbg09hBzVMVRBut+meLFMgQicOFxX4cOH/zmBq2HZZ4NgoXQIttG2MWvRDeeOArcoiR4trg88CvXIKbHm7X3Xz124i1la6Znzd233vMLjW61sfm2BSiRvi2U199hCeHLpCKZDeXEfNKKws4/PCyJpilTrDhy01w/oqI6uKjCvuEpfNoDSqx4gfjAyjJboFWEV2ArMddk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAzNyDe1tBrOdz2+WL/pj9pc2M51PHCPiPpvoZYn4bHE#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBELYxft8jWfz1ywTUaPBtZwChEDFG53eKlkYcIDxgJP7KVnKVHGrkh7LMAVvlpn5gDq4gHPOx2/pvsvKR+u3AfU=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDx6FoZ1mQHUkExUKBX3RXUJtaZVmdK+/kJ75+oWOFtIZlx0mZcdVNn/rW0Q++oQhtNRWXFfZrC6xkhCT1INz4AehTVQ2y9DTa6PxylfZKv4SS0yNLP/UkFFMiKtWgxzfnFYniRmVr6pgKNAsIxOlGQHtYY9MzvNCU0rfxVJQV1DM7am+c3mbsqlU0w7R+Tur5zDSLFdysQdDqAk4UqlqkgYagUBOhC/cnkuUNOyj3idOKJhFrz/mnkO3P/KrXcgMPfFtu+yx5rQNDNyoZV1bp+uPgP8kvQGe5ol/cbTEiXlZ5BEgYcKbky8H1ICbcoiG5YcmEMNOm8s88fxvf6dJpdeAmjmraoHZtKson2jeZ7NsYgsjNhwKEElcxzAfhnhK+IfalpZhHQxGypR/IPlQrLlJOrbyAEIyk40nASUHxlJrOXP1lA9dvLaG/3KkIa2sPwaIgdVhzpmyodJds2sMg6cngRljDGY1UBTYGyo8vNNILFoCzMPNDcNCyY9xWYz8M=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICDQuz7VE0tTRnQJ96QrHIwmJh8osJY9A2+gmzkUlh54#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM7hnQz957+RtY0Mltzkw+lJRI4x2IlQwAuVKb+t24lorNdYqOmeiT8j8X9huVxPKGZSUxesKQ7YFrI9bxqNRo4=#012 create=True mode=0644 path=/tmp/ansible.chaqduzc state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:31:03 np0005591762 python3.9[68943]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.chaqduzc' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:31:03 np0005591762 python3.9[69097]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.chaqduzc state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:31:03 np0005591762 systemd[1]: session-15.scope: Deactivated successfully.
Jan 22 04:31:03 np0005591762 systemd[1]: session-15.scope: Consumed 2.321s CPU time.
Jan 22 04:31:03 np0005591762 systemd-logind[744]: Session 15 logged out. Waiting for processes to exit.
Jan 22 04:31:03 np0005591762 systemd-logind[744]: Removed session 15.
Jan 22 04:31:09 np0005591762 systemd-logind[744]: New session 16 of user zuul.
Jan 22 04:31:09 np0005591762 systemd[1]: Started Session 16 of User zuul.
Jan 22 04:31:09 np0005591762 python3.9[69275]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:31:10 np0005591762 python3.9[69431]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 04:31:11 np0005591762 python3.9[69585]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:31:12 np0005591762 python3.9[69738]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:31:12 np0005591762 python3.9[69891]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:31:12 np0005591762 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 04:31:13 np0005591762 python3.9[70047]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:31:13 np0005591762 python3.9[70202]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:31:14 np0005591762 systemd[1]: session-16.scope: Deactivated successfully.
Jan 22 04:31:14 np0005591762 systemd[1]: session-16.scope: Consumed 2.972s CPU time.
Jan 22 04:31:14 np0005591762 systemd-logind[744]: Session 16 logged out. Waiting for processes to exit.
Jan 22 04:31:14 np0005591762 systemd-logind[744]: Removed session 16.
Jan 22 04:31:19 np0005591762 systemd-logind[744]: New session 17 of user zuul.
Jan 22 04:31:19 np0005591762 systemd[1]: Started Session 17 of User zuul.
Jan 22 04:31:20 np0005591762 python3.9[70380]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:31:20 np0005591762 python3.9[70536]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:31:21 np0005591762 python3.9[70620]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 04:31:22 np0005591762 python3.9[70771]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:31:23 np0005591762 python3.9[70922]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 04:31:24 np0005591762 python3.9[71072]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:31:24 np0005591762 python3.9[71222]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:31:25 np0005591762 systemd[1]: session-17.scope: Deactivated successfully.
Jan 22 04:31:25 np0005591762 systemd[1]: session-17.scope: Consumed 4.128s CPU time.
Jan 22 04:31:25 np0005591762 systemd-logind[744]: Session 17 logged out. Waiting for processes to exit.
Jan 22 04:31:25 np0005591762 systemd-logind[744]: Removed session 17.
Jan 22 04:31:32 np0005591762 systemd-logind[744]: New session 18 of user zuul.
Jan 22 04:31:32 np0005591762 systemd[1]: Started Session 18 of User zuul.
Jan 22 04:31:36 np0005591762 python3[71988]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:31:37 np0005591762 python3[72079]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 22 04:31:38 np0005591762 python3[72106]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 22 04:31:38 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:31:39 np0005591762 python3[72133]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:31:39 np0005591762 kernel: loop: module loaded
Jan 22 04:31:39 np0005591762 kernel: loop3: detected capacity change from 0 to 41943040
Jan 22 04:31:39 np0005591762 python3[72168]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:31:39 np0005591762 lvm[72171]: PV /dev/loop3 not used.
Jan 22 04:31:39 np0005591762 lvm[72180]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 04:31:39 np0005591762 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 22 04:31:39 np0005591762 lvm[72182]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 22 04:31:39 np0005591762 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 22 04:31:39 np0005591762 python3[72260]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 04:31:40 np0005591762 python3[72333]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769074299.656197-37339-182249827411624/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:31:40 np0005591762 python3[72383]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:31:40 np0005591762 systemd[1]: Reloading.
Jan 22 04:31:40 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:31:40 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:31:40 np0005591762 systemd[1]: Starting Ceph OSD losetup...
Jan 22 04:31:40 np0005591762 bash[72423]: /dev/loop3: [64513]:4328461 (/var/lib/ceph-osd-0.img)
Jan 22 04:31:40 np0005591762 systemd[1]: Finished Ceph OSD losetup.
Jan 22 04:31:40 np0005591762 lvm[72424]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 04:31:40 np0005591762 lvm[72424]: VG ceph_vg0 finished
Jan 22 04:31:42 np0005591762 python3[72448]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:32:18 np0005591762 chronyd[58468]: Selected source 99.28.14.242 (pool.ntp.org)
Jan 22 04:32:56 np0005591762 systemd[1]: Created slice User Slice of UID 42477.
Jan 22 04:32:56 np0005591762 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 22 04:32:56 np0005591762 systemd-logind[744]: New session 19 of user ceph-admin.
Jan 22 04:32:56 np0005591762 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 22 04:32:56 np0005591762 systemd[1]: Starting User Manager for UID 42477...
Jan 22 04:32:56 np0005591762 systemd[72496]: Queued start job for default target Main User Target.
Jan 22 04:32:56 np0005591762 systemd[72496]: Created slice User Application Slice.
Jan 22 04:32:56 np0005591762 systemd[72496]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 04:32:56 np0005591762 systemd[72496]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 04:32:56 np0005591762 systemd[72496]: Reached target Paths.
Jan 22 04:32:56 np0005591762 systemd[72496]: Reached target Timers.
Jan 22 04:32:56 np0005591762 systemd[72496]: Starting D-Bus User Message Bus Socket...
Jan 22 04:32:56 np0005591762 systemd[72496]: Starting Create User's Volatile Files and Directories...
Jan 22 04:32:56 np0005591762 systemd[72496]: Listening on D-Bus User Message Bus Socket.
Jan 22 04:32:56 np0005591762 systemd[72496]: Reached target Sockets.
Jan 22 04:32:56 np0005591762 systemd[72496]: Finished Create User's Volatile Files and Directories.
Jan 22 04:32:56 np0005591762 systemd[72496]: Reached target Basic System.
Jan 22 04:32:56 np0005591762 systemd[1]: Started User Manager for UID 42477.
Jan 22 04:32:56 np0005591762 systemd[72496]: Reached target Main User Target.
Jan 22 04:32:56 np0005591762 systemd[72496]: Startup finished in 79ms.
Jan 22 04:32:56 np0005591762 systemd[1]: Started Session 19 of User ceph-admin.
Jan 22 04:32:56 np0005591762 systemd-logind[744]: New session 21 of user ceph-admin.
Jan 22 04:32:56 np0005591762 systemd[1]: Started Session 21 of User ceph-admin.
Jan 22 04:32:57 np0005591762 systemd-logind[744]: New session 22 of user ceph-admin.
Jan 22 04:32:57 np0005591762 systemd[1]: Started Session 22 of User ceph-admin.
Jan 22 04:32:57 np0005591762 systemd-logind[744]: New session 23 of user ceph-admin.
Jan 22 04:32:57 np0005591762 systemd[1]: Started Session 23 of User ceph-admin.
Jan 22 04:32:57 np0005591762 systemd-logind[744]: New session 24 of user ceph-admin.
Jan 22 04:32:57 np0005591762 systemd[1]: Started Session 24 of User ceph-admin.
Jan 22 04:32:57 np0005591762 systemd-logind[744]: New session 25 of user ceph-admin.
Jan 22 04:32:57 np0005591762 systemd[1]: Started Session 25 of User ceph-admin.
Jan 22 04:32:58 np0005591762 systemd-logind[744]: New session 26 of user ceph-admin.
Jan 22 04:32:58 np0005591762 systemd[1]: Started Session 26 of User ceph-admin.
Jan 22 04:32:58 np0005591762 systemd-logind[744]: New session 27 of user ceph-admin.
Jan 22 04:32:58 np0005591762 systemd[1]: Started Session 27 of User ceph-admin.
Jan 22 04:32:58 np0005591762 systemd-logind[744]: New session 28 of user ceph-admin.
Jan 22 04:32:58 np0005591762 systemd[1]: Started Session 28 of User ceph-admin.
Jan 22 04:32:59 np0005591762 systemd-logind[744]: New session 29 of user ceph-admin.
Jan 22 04:32:59 np0005591762 systemd[1]: Started Session 29 of User ceph-admin.
Jan 22 04:32:59 np0005591762 systemd-logind[744]: New session 30 of user ceph-admin.
Jan 22 04:32:59 np0005591762 systemd[1]: Started Session 30 of User ceph-admin.
Jan 22 04:33:00 np0005591762 systemd-logind[744]: New session 31 of user ceph-admin.
Jan 22 04:33:00 np0005591762 systemd[1]: Started Session 31 of User ceph-admin.
Jan 22 04:33:00 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:29 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:29 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:29 np0005591762 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73081 (sysctl)
Jan 22 04:33:30 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:30 np0005591762 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 22 04:33:30 np0005591762 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 22 04:33:30 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:32 np0005591762 systemd[1]: var-lib-containers-storage-overlay-compat1198358310-merged.mount: Deactivated successfully.
Jan 22 04:33:32 np0005591762 systemd[1]: var-lib-containers-storage-overlay-compat1198358310-lower\x2dmapped.mount: Deactivated successfully.
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.455484173 +0000 UTC m=+16.554382729 container create 7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_bose, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True)
Jan 22 04:33:47 np0005591762 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck4043544878-merged.mount: Deactivated successfully.
Jan 22 04:33:47 np0005591762 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 22 04:33:47 np0005591762 systemd[1]: Started libpod-conmon-7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd.scope.
Jan 22 04:33:47 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.522452064 +0000 UTC m=+16.621350620 container init 7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_bose, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.527145125 +0000 UTC m=+16.626043672 container start 7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_bose, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.528300371 +0000 UTC m=+16.627198927 container attach 7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_bose, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:33:47 np0005591762 hopeful_bose[73299]: 167 167
Jan 22 04:33:47 np0005591762 systemd[1]: libpod-7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd.scope: Deactivated successfully.
Jan 22 04:33:47 np0005591762 conmon[73299]: conmon 7177f4d32cfd63f56ca6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd.scope/container/memory.events
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.531882472 +0000 UTC m=+16.630781048 container died 7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_bose, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.443285015 +0000 UTC m=+16.542183570 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:47 np0005591762 systemd[1]: var-lib-containers-storage-overlay-0dff7d205e9a39356194988911b67a965f0ec56a2122959453bd17170b35762e-merged.mount: Deactivated successfully.
Jan 22 04:33:47 np0005591762 podman[73249]: 2026-01-22 09:33:47.547867432 +0000 UTC m=+16.646765989 container remove 7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_bose, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:33:47 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:47 np0005591762 systemd[1]: libpod-conmon-7177f4d32cfd63f56ca6df5a411168daa399fc1cc0e13820623df91f8a0fd6dd.scope: Deactivated successfully.
Jan 22 04:33:47 np0005591762 podman[73322]: 2026-01-22 09:33:47.660829955 +0000 UTC m=+0.028426637 container create 2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_meninsky, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:33:47 np0005591762 systemd[1]: Started libpod-conmon-2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6.scope.
Jan 22 04:33:47 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:33:47 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dd09d895e6ba163bd8232efc879bb1446cbcfdc4d94ec66b081aec0f0d4248f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:47 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dd09d895e6ba163bd8232efc879bb1446cbcfdc4d94ec66b081aec0f0d4248f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:47 np0005591762 podman[73322]: 2026-01-22 09:33:47.704335412 +0000 UTC m=+0.071932104 container init 2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_meninsky, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:33:47 np0005591762 podman[73322]: 2026-01-22 09:33:47.708715032 +0000 UTC m=+0.076311714 container start 2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_meninsky, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:33:47 np0005591762 podman[73322]: 2026-01-22 09:33:47.709910635 +0000 UTC m=+0.077507318 container attach 2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_meninsky, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:33:47 np0005591762 podman[73322]: 2026-01-22 09:33:47.649938955 +0000 UTC m=+0.017535657 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]: [
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:    {
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "available": false,
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "being_replaced": false,
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "ceph_device_lvm": false,
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "lsm_data": {},
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "lvs": [],
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "path": "/dev/sr0",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "rejected_reasons": [
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "Has a FileSystem",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "Insufficient space (<5GB)"
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        ],
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        "sys_api": {
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "actuators": null,
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "device_nodes": [
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:                "sr0"
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            ],
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "devname": "sr0",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "human_readable_size": "474.00 KB",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "id_bus": "ata",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "model": "QEMU DVD-ROM",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "nr_requests": "64",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "parent": "/dev/sr0",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "partitions": {},
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "path": "/dev/sr0",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "removable": "1",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "rev": "2.5+",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "ro": "0",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "rotational": "1",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "sas_address": "",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "sas_device_handle": "",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "scheduler_mode": "mq-deadline",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "sectors": 0,
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "sectorsize": "2048",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "size": 485376.0,
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "support_discard": "2048",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "type": "disk",
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:            "vendor": "QEMU"
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:        }
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]:    }
Jan 22 04:33:48 np0005591762 flamboyant_meninsky[73335]: ]
Jan 22 04:33:48 np0005591762 systemd[1]: libpod-2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6.scope: Deactivated successfully.
Jan 22 04:33:48 np0005591762 podman[74207]: 2026-01-22 09:33:48.22401923 +0000 UTC m=+0.017851982 container died 2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS)
Jan 22 04:33:48 np0005591762 podman[74207]: 2026-01-22 09:33:48.242034566 +0000 UTC m=+0.035867307 container remove 2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 22 04:33:48 np0005591762 systemd[1]: libpod-conmon-2965be258ea9da769fa46701a213374fec8e808854d75c6a767a894170ba51d6.scope: Deactivated successfully.
Jan 22 04:33:49 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:49 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.056243417 +0000 UTC m=+0.026096406 container create 28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_greider, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 22 04:33:50 np0005591762 systemd[1]: Started libpod-conmon-28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a.scope.
Jan 22 04:33:50 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.107497486 +0000 UTC m=+0.077350496 container init 28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_greider, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.111613931 +0000 UTC m=+0.081466921 container start 28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_greider, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.112622397 +0000 UTC m=+0.082475386 container attach 28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 04:33:50 np0005591762 priceless_greider[75206]: 167 167
Jan 22 04:33:50 np0005591762 systemd[1]: libpod-28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a.scope: Deactivated successfully.
Jan 22 04:33:50 np0005591762 conmon[75206]: conmon 28f3faf3cf530e3955b0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a.scope/container/memory.events
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.115213946 +0000 UTC m=+0.085066927 container died 28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.129606152 +0000 UTC m=+0.099459142 container remove 28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 22 04:33:50 np0005591762 podman[75193]: 2026-01-22 09:33:50.046125419 +0000 UTC m=+0.015978409 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:50 np0005591762 systemd[1]: libpod-conmon-28f3faf3cf530e3955b0c1f82441e20ca2501fbd9a8d3f4f2dbbe769ad8a3a6a.scope: Deactivated successfully.
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.169490484 +0000 UTC m=+0.024960507 container create 50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_engelbart, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:33:50 np0005591762 systemd[1]: Started libpod-conmon-50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1.scope.
Jan 22 04:33:50 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:33:50 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daaf88b0d6fe0ba7140214712e1eca4b492c9dfa67f81887010c7357cda2d4bb/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:50 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daaf88b0d6fe0ba7140214712e1eca4b492c9dfa67f81887010c7357cda2d4bb/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:50 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daaf88b0d6fe0ba7140214712e1eca4b492c9dfa67f81887010c7357cda2d4bb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:50 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daaf88b0d6fe0ba7140214712e1eca4b492c9dfa67f81887010c7357cda2d4bb/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.209259775 +0000 UTC m=+0.064729818 container init 50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_engelbart, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.215114996 +0000 UTC m=+0.070585019 container start 50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_engelbart, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.21707248 +0000 UTC m=+0.072542503 container attach 50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_engelbart, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 22 04:33:50 np0005591762 systemd[1]: libpod-50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1.scope: Deactivated successfully.
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.255800293 +0000 UTC m=+0.111270316 container died 50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_engelbart, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.159219543 +0000 UTC m=+0.014689586 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:50 np0005591762 podman[75220]: 2026-01-22 09:33:50.271465641 +0000 UTC m=+0.126935664 container remove 50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_engelbart, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 22 04:33:50 np0005591762 systemd[1]: libpod-conmon-50c941c762e04efcb69c65092493e765277dc55b45b47e17173fa60a9fd10da1.scope: Deactivated successfully.
Jan 22 04:33:50 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:50 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:50 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:50 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:50 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:50 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:50 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:50 np0005591762 systemd[1]: Reached target All Ceph clusters and services.
Jan 22 04:33:50 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:50 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:50 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:50 np0005591762 systemd[1]: Reached target Ceph cluster 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:33:50 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:50 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:50 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:51 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:51 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:51 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:51 np0005591762 systemd[1]: Created slice Slice /system/ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:33:51 np0005591762 systemd[1]: Reached target System Time Set.
Jan 22 04:33:51 np0005591762 systemd[1]: Reached target System Time Synchronized.
Jan 22 04:33:51 np0005591762 systemd[1]: Starting Ceph mon.compute-2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:33:51 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:51 np0005591762 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 04:33:51 np0005591762 podman[75503]: 2026-01-22 09:33:51.409226985 +0000 UTC m=+0.026061807 container create f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 22 04:33:51 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9b8f7732c816272640ce539695f15e8173cb548dbf5e5164708bdd5b18fc5f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:51 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9b8f7732c816272640ce539695f15e8173cb548dbf5e5164708bdd5b18fc5f0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:51 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9b8f7732c816272640ce539695f15e8173cb548dbf5e5164708bdd5b18fc5f0/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:51 np0005591762 podman[75503]: 2026-01-22 09:33:51.398673301 +0000 UTC m=+0.015508124 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:51 np0005591762 podman[75503]: 2026-01-22 09:33:51.722119424 +0000 UTC m=+0.338954247 container init f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:33:51 np0005591762 podman[75503]: 2026-01-22 09:33:51.726240041 +0000 UTC m=+0.343074864 container start f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: set uid:gid to 167:167 (ceph:ceph)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: pidfile_write: ignore empty --pid-file
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: load: jerasure load: lrc 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: RocksDB version: 7.9.2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Git sha 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: DB SUMMARY
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: DB Session ID:  N2XKLI89NZ0D85XH10TG
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: CURRENT file:  CURRENT
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: IDENTITY file:  IDENTITY
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                         Options.error_if_exists: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.create_if_missing: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                         Options.paranoid_checks: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                                     Options.env: 0x55a023befc20
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                                Options.info_log: 0x55a025f25a20
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.max_file_opening_threads: 16
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                              Options.statistics: (nil)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                               Options.use_fsync: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.max_log_file_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                         Options.allow_fallocate: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                        Options.use_direct_reads: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.create_missing_column_families: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                              Options.db_log_dir: 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                                 Options.wal_dir: 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.advise_random_on_open: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                    Options.write_buffer_manager: 0x55a025f29900
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                            Options.rate_limiter: (nil)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.unordered_write: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                               Options.row_cache: None
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                              Options.wal_filter: None
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.allow_ingest_behind: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.two_write_queues: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.manual_wal_flush: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.wal_compression: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.atomic_flush: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.log_readahead_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.allow_data_in_errors: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.db_host_id: __hostname__
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.max_background_jobs: 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.max_background_compactions: -1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.max_subcompactions: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.max_total_wal_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                          Options.max_open_files: -1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                          Options.bytes_per_sync: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:       Options.compaction_readahead_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.max_background_flushes: -1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Compression algorithms supported:
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kZSTD supported: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kXpressCompression supported: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kBZip2Compression supported: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kZSTDNotFinalCompression supported: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kLZ4Compression supported: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kZlibCompression supported: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kLZ4HCCompression supported: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     kSnappyCompression supported: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:           Options.merge_operator: 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:        Options.compaction_filter: None
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a025f245c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a025f49350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:        Options.write_buffer_size: 33554432
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:  Options.max_write_buffer_number: 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.compression: NoCompression
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.num_levels: 7
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 06bb68f0-fa7d-4244-8ddc-ccad0aff042d
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074431756574, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074431757768, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074431757852, "job": 1, "event": "recovery_finished"}
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a025f4ae00
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: DB pointer 0x55a026054000
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.27 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.27 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a025f49350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 43df7a30-cf5f-5209-adfd-bf44298b19f2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(???) e0 preinit fsid 43df7a30-cf5f-5209-adfd-bf44298b19f2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012btime 2026-01-22T09:32:16:777810+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 2 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e11 crush map has features 3314933000852226048, adjusting msgr requires
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e11 crush map has features 288514051259236352, adjusting msgr requires
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e11 crush map has features 288514051259236352, adjusting msgr requires
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).osd e11 crush map has features 288514051259236352, adjusting msgr requires
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.1 [v2:192.168.122.101:6800/1582813310,v1:192.168.122.101:6801/1582813310]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.0 [v2:192.168.122.100:6802/1679360742,v1:192.168.122.100:6803/1679360742]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.1 [v2:192.168.122.101:6800/1582813310,v1:192.168.122.101:6801/1582813310]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.0 [v2:192.168.122.100:6802/1679360742,v1:192.168.122.100:6803/1679360742]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.0 [v2:192.168.122.100:6802/1679360742,v1:192.168.122.100:6803/1679360742]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.1 [v2:192.168.122.101:6800/1582813310,v1:192.168.122.101:6801/1582813310]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Adjusting osd_memory_target on compute-1 to  5248M
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Adjusting osd_memory_target on compute-0 to 128.7M
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Unable to set osd_memory_target on compute-0 to 134966067: error parsing value: Value '134966067' is below minimum 939524096
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.0 [v2:192.168.122.100:6802/1679360742,v1:192.168.122.100:6803/1679360742]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='osd.1 [v2:192.168.122.101:6800/1582813310,v1:192.168.122.101:6801/1582813310]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: OSD bench result of 21198.326412 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: OSD bench result of 18822.535110 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: osd.1 [v2:192.168.122.101:6800/1582813310,v1:192.168.122.101:6801/1582813310] boot
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: osd.0 [v2:192.168.122.100:6802/1679360742,v1:192.168.122.100:6803/1679360742] boot
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.conf
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Deploying daemon mon.compute-2 on compute-2
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: Cluster is now healthy
Jan 22 04:33:51 np0005591762 ceph-mon[75519]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 22 04:33:52 np0005591762 bash[75503]: f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4
Jan 22 04:33:52 np0005591762 systemd[1]: Started Ceph mon.compute-2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:33:53 np0005591762 ceph-mon[75519]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 22 04:33:53 np0005591762 ceph-mon[75519]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 22 04:33:53 np0005591762 ceph-mon[75519]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 22 04:33:53 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC 7763 64-Core Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:04:00.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7865364,os=Linux}
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-0 calling monitor election
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-2 calling monitor election
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: overall HEALTH_OK
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:33:56 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 22 04:33:57 np0005591762 ceph-mon[75519]: Deploying daemon mon.compute-1 on compute-1
Jan 22 04:33:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 22 04:33:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 22 04:33:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 22 04:33:58 np0005591762 ceph-mon[75519]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 22 04:33:58 np0005591762 ceph-mon[75519]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 22 04:33:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.565028468 +0000 UTC m=+0.031146311 container create ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_leavitt, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:33:58 np0005591762 systemd[1]: Started libpod-conmon-ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff.scope.
Jan 22 04:33:58 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.622187943 +0000 UTC m=+0.088305807 container init ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_leavitt, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.626933368 +0000 UTC m=+0.093051212 container start ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_leavitt, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:33:58 np0005591762 infallible_leavitt[75655]: 167 167
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.631437859 +0000 UTC m=+0.097555703 container attach ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_leavitt, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.631733456 +0000 UTC m=+0.097851441 container died ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_leavitt, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True)
Jan 22 04:33:58 np0005591762 systemd[1]: libpod-ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff.scope: Deactivated successfully.
Jan 22 04:33:58 np0005591762 systemd[1]: var-lib-containers-storage-overlay-3ee23490cf20a99edd52ca16b6ae9c70c4f6c5c865b54b8735bf5d5b9e9b7c2b-merged.mount: Deactivated successfully.
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.552097563 +0000 UTC m=+0.018215427 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:58 np0005591762 podman[75642]: 2026-01-22 09:33:58.654852004 +0000 UTC m=+0.120969848 container remove ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_leavitt, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:33:58 np0005591762 systemd[1]: libpod-conmon-ccc73d29629ec57bb08b12351ed82229bc8064d88745a357bc960b11dc8504ff.scope: Deactivated successfully.
Jan 22 04:33:58 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:58 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:58 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:58 np0005591762 systemd[1]: Reloading.
Jan 22 04:33:58 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:33:58 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:33:59 np0005591762 systemd[1]: Starting Ceph mgr.compute-2.bisona for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:33:59 np0005591762 podman[75786]: 2026-01-22 09:33:59.315343153 +0000 UTC m=+0.038631040 container create cc771d8f677d2e65881d21d0e7568af2905734c32c74919a07a91c6fe24788c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 22 04:33:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0cb4ed8319ea3f0507a0dbb87b1461cf53efa8a8f26ed7d46ee3ee42bd31273/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0cb4ed8319ea3f0507a0dbb87b1461cf53efa8a8f26ed7d46ee3ee42bd31273/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0cb4ed8319ea3f0507a0dbb87b1461cf53efa8a8f26ed7d46ee3ee42bd31273/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0cb4ed8319ea3f0507a0dbb87b1461cf53efa8a8f26ed7d46ee3ee42bd31273/merged/var/lib/ceph/mgr/ceph-compute-2.bisona supports timestamps until 2038 (0x7fffffff)
Jan 22 04:33:59 np0005591762 podman[75786]: 2026-01-22 09:33:59.359939647 +0000 UTC m=+0.083227525 container init cc771d8f677d2e65881d21d0e7568af2905734c32c74919a07a91c6fe24788c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:33:59 np0005591762 podman[75786]: 2026-01-22 09:33:59.364041369 +0000 UTC m=+0.087329247 container start cc771d8f677d2e65881d21d0e7568af2905734c32c74919a07a91c6fe24788c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 22 04:33:59 np0005591762 bash[75786]: cc771d8f677d2e65881d21d0e7568af2905734c32c74919a07a91c6fe24788c9
Jan 22 04:33:59 np0005591762 podman[75786]: 2026-01-22 09:33:59.300697616 +0000 UTC m=+0.023985503 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:33:59 np0005591762 systemd[1]: Started Ceph mgr.compute-2.bisona for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:33:59 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:33:59 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:33:59 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:00 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: Deploying daemon mgr.compute-2.bisona on compute-2
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-0 calling monitor election
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-2 calling monitor election
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-1 calling monitor election
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: overall HEALTH_OK
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.upcmhd", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.upcmhd", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: set uid:gid to 167:167 (ceph:ceph)
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: pidfile_write: ignore empty --pid-file
Jan 22 04:34:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'alerts'
Jan 22 04:34:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:03.924+0000 7f64ec755140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'balancer'
Jan 22 04:34:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:03.996+0000 7f64ec755140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:03 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'cephadm'
Jan 22 04:34:04 np0005591762 ceph-mon[75519]: Deploying daemon mgr.compute-1.upcmhd on compute-1
Jan 22 04:34:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:04 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'crash'
Jan 22 04:34:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:04.713+0000 7f64ec755140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:04 np0005591762 ceph-mgr[75802]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:04 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'dashboard'
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.73612959 +0000 UTC m=+0.058832010 container create 5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_darwin, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default)
Jan 22 04:34:04 np0005591762 systemd[1]: Started libpod-conmon-5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9.scope.
Jan 22 04:34:04 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.8097689 +0000 UTC m=+0.132471329 container init 5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_darwin, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.716195264 +0000 UTC m=+0.038897703 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.817317509 +0000 UTC m=+0.140019928 container start 5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_darwin, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.818934997 +0000 UTC m=+0.141637417 container attach 5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_darwin, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 22 04:34:04 np0005591762 laughing_darwin[75930]: 167 167
Jan 22 04:34:04 np0005591762 systemd[1]: libpod-5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9.scope: Deactivated successfully.
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.823332126 +0000 UTC m=+0.146034545 container died 5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_darwin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:34:04 np0005591762 systemd[1]: var-lib-containers-storage-overlay-02116c30dd1198a419c6a83f882cccccd0fc70af3c8358db6666b6454c0b20d9-merged.mount: Deactivated successfully.
Jan 22 04:34:04 np0005591762 podman[75917]: 2026-01-22 09:34:04.854608291 +0000 UTC m=+0.177310710 container remove 5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_darwin, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:04 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:04 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:04 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:05 np0005591762 systemd[1]: libpod-conmon-5ce9f44cd7ec685a08173b9570bfa01eb0fdc01b07bcc0096f583edb6ea848d9.scope: Deactivated successfully.
Jan 22 04:34:05 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'devicehealth'
Jan 22 04:34:05 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:05 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: Deploying daemon crash.compute-2 on compute-2
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1487156669' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e12 e12: 2 total, 2 up, 2 in
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:05.283+0000 7f64ec755140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'diskprediction_local'
Jan 22 04:34:05 np0005591762 systemd[1]: Starting Ceph crash.compute-2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]:  from numpy import show_config as show_numpy_config
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:05.427+0000 7f64ec755140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'influx'
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:05.489+0000 7f64ec755140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'insights'
Jan 22 04:34:05 np0005591762 podman[76059]: 2026-01-22 09:34:05.51703071 +0000 UTC m=+0.025403918 container create 90540f6c5eebea73717d6c9c52af7be33cc48e9d71309e470e0a12355fa79314 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 22 04:34:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd3c89a14e33083c81975a3bfe420258333ee692e9d82ad3b463db11dc2ed83a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd3c89a14e33083c81975a3bfe420258333ee692e9d82ad3b463db11dc2ed83a/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd3c89a14e33083c81975a3bfe420258333ee692e9d82ad3b463db11dc2ed83a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd3c89a14e33083c81975a3bfe420258333ee692e9d82ad3b463db11dc2ed83a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'iostat'
Jan 22 04:34:05 np0005591762 podman[76059]: 2026-01-22 09:34:05.562178584 +0000 UTC m=+0.070551811 container init 90540f6c5eebea73717d6c9c52af7be33cc48e9d71309e470e0a12355fa79314 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Jan 22 04:34:05 np0005591762 podman[76059]: 2026-01-22 09:34:05.566111046 +0000 UTC m=+0.074484254 container start 90540f6c5eebea73717d6c9c52af7be33cc48e9d71309e470e0a12355fa79314 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 22 04:34:05 np0005591762 bash[76059]: 90540f6c5eebea73717d6c9c52af7be33cc48e9d71309e470e0a12355fa79314
Jan 22 04:34:05 np0005591762 podman[76059]: 2026-01-22 09:34:05.506049932 +0000 UTC m=+0.014423160 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:05 np0005591762 systemd[1]: Started Ceph crash.compute-2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'k8sevents'
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:05.610+0000 7f64ec755140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.700+0000 7f9b59a19640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.700+0000 7f9b59a19640 -1 AuthRegistry(0x7f9b54069b10) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.701+0000 7f9b59a19640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.701+0000 7f9b59a19640 -1 AuthRegistry(0x7f9b59a17ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.701+0000 7f9b52ffd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.702+0000 7f9b537fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.702+0000 7f9b527fc640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: 2026-01-22T09:34:05.702+0000 7f9b59a19640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 22 04:34:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-crash-compute-2[76071]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 22 04:34:05 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'localpool'
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mds_autoscaler'
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.023619155 +0000 UTC m=+0.030917649 container create 3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_gauss, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 22 04:34:06 np0005591762 systemd[1]: Started libpod-conmon-3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b.scope.
Jan 22 04:34:06 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.076553055 +0000 UTC m=+0.083851580 container init 3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_gauss, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.081953414 +0000 UTC m=+0.089251919 container start 3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_gauss, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2)
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.084801162 +0000 UTC m=+0.092099666 container attach 3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_gauss, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:34:06 np0005591762 loving_gauss[76181]: 167 167
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.085863405 +0000 UTC m=+0.093161910 container died 3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_gauss, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:06 np0005591762 systemd[1]: libpod-3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b.scope: Deactivated successfully.
Jan 22 04:34:06 np0005591762 systemd[1]: var-lib-containers-storage-overlay-7be7a77123bc9006c69c4ade8e199d2d057e02e7095cfdae4546557d8d1451a0-merged.mount: Deactivated successfully.
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.104070044 +0000 UTC m=+0.111368548 container remove 3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_gauss, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 22 04:34:06 np0005591762 podman[76167]: 2026-01-22 09:34:06.010866596 +0000 UTC m=+0.018165101 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:06 np0005591762 systemd[1]: libpod-conmon-3b6d5f01d0589957220a6322f6185e55b4fea4d4b5608cf1fea909e3bb56ec1b.scope: Deactivated successfully.
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mirroring'
Jan 22 04:34:06 np0005591762 podman[76203]: 2026-01-22 09:34:06.217406171 +0000 UTC m=+0.029279603 container create 0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=youthful_shannon, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:06 np0005591762 systemd[1]: Started libpod-conmon-0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529.scope.
Jan 22 04:34:06 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c252a1716a0a75c67302570cb1699e20b3267f1b7a7cded6f3844ef7761ee3ca/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c252a1716a0a75c67302570cb1699e20b3267f1b7a7cded6f3844ef7761ee3ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c252a1716a0a75c67302570cb1699e20b3267f1b7a7cded6f3844ef7761ee3ca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c252a1716a0a75c67302570cb1699e20b3267f1b7a7cded6f3844ef7761ee3ca/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c252a1716a0a75c67302570cb1699e20b3267f1b7a7cded6f3844ef7761ee3ca/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1487156669' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1829889370' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 22 04:34:06 np0005591762 podman[76203]: 2026-01-22 09:34:06.282361271 +0000 UTC m=+0.094234713 container init 0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=youthful_shannon, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e13 e13: 2 total, 2 up, 2 in
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'nfs'
Jan 22 04:34:06 np0005591762 podman[76203]: 2026-01-22 09:34:06.288377832 +0000 UTC m=+0.100251254 container start 0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=youthful_shannon, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Jan 22 04:34:06 np0005591762 podman[76203]: 2026-01-22 09:34:06.290468373 +0000 UTC m=+0.102341794 container attach 0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=youthful_shannon, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 22 04:34:06 np0005591762 podman[76203]: 2026-01-22 09:34:06.205499217 +0000 UTC m=+0.017372658 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'orchestrator'
Jan 22 04:34:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:06.486+0000 7f64ec755140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 youthful_shannon[76217]: --> passed data devices: 0 physical, 1 LVM
Jan 22 04:34:06 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:06 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:06 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new e7fde3af-8dcc-4261-b14b-26da738aa0fb
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_perf_query'
Jan 22 04:34:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:06.675+0000 7f64ec755140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_support'
Jan 22 04:34:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:06.744+0000 7f64ec755140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e13 _set_new_cache_sizes cache_size:1019933756 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'pg_autoscaler'
Jan 22 04:34:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:06.803+0000 7f64ec755140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'progress'
Jan 22 04:34:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:06.872+0000 7f64ec755140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e14 e14: 3 total, 2 up, 3 in
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:06 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'prometheus'
Jan 22 04:34:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:06.934+0000 7f64ec755140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 22 04:34:07 np0005591762 lvm[76278]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 04:34:07 np0005591762 lvm[76278]: VG ceph_vg0 finished
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rbd_support'
Jan 22 04:34:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:07.231+0000 7f64ec755140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1829889370' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/1162022378' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "e7fde3af-8dcc-4261-b14b-26da738aa0fb"}]: dispatch
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/1162022378' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "e7fde3af-8dcc-4261-b14b-26da738aa0fb"}]': finished
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3569405702' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'restful'
Jan 22 04:34:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:07.316+0000 7f64ec755140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/17361913' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: stderr: got monmap epoch 3
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: --> Creating keyring file for osd.2
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 22 04:34:07 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid e7fde3af-8dcc-4261-b14b-26da738aa0fb --setuser ceph --setgroup ceph
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rgw'
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:07.695+0000 7f64ec755140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:07 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rook'
Jan 22 04:34:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e15 e15: 3 total, 2 up, 3 in
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.175+0000 7f64ec755140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'selftest'
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.237+0000 7f64ec755140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'snap_schedule'
Jan 22 04:34:08 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3569405702' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 22 04:34:08 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'stats'
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.307+0000 7f64ec755140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'status'
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.442+0000 7f64ec755140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telegraf'
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.504+0000 7f64ec755140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telemetry'
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.638+0000 7f64ec755140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'test_orchestrator'
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'volumes'
Jan 22 04:34:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:08.832+0000 7f64ec755140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e16 e16: 3 total, 2 up, 3 in
Jan 22 04:34:09 np0005591762 ceph-mgr[75802]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:09.076+0000 7f64ec755140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:09 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'zabbix'
Jan 22 04:34:09 np0005591762 ceph-mgr[75802]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:09.140+0000 7f64ec755140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:09 np0005591762 ceph-mgr[75802]: ms_deliver_dispatch: unhandled message 0x55723d5bed00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Jan 22 04:34:09 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/2679896897' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 22 04:34:09 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:09 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:09 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/2679896897' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 22 04:34:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e17 e17: 3 total, 2 up, 3 in
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: stderr: 2026-01-22T09:34:07.406+0000 7fc04a7ff740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: stderr: 2026-01-22T09:34:07.672+0000 7fc04a7ff740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 22 04:34:10 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1916958918' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 22 04:34:10 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1916958918' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 22 04:34:10 np0005591762 youthful_shannon[76217]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 22 04:34:10 np0005591762 systemd[1]: libpod-0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529.scope: Deactivated successfully.
Jan 22 04:34:10 np0005591762 systemd[1]: libpod-0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529.scope: Consumed 1.417s CPU time.
Jan 22 04:34:10 np0005591762 podman[76203]: 2026-01-22 09:34:10.453280862 +0000 UTC m=+4.265154283 container died 0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=youthful_shannon, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:10 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c252a1716a0a75c67302570cb1699e20b3267f1b7a7cded6f3844ef7761ee3ca-merged.mount: Deactivated successfully.
Jan 22 04:34:10 np0005591762 podman[76203]: 2026-01-22 09:34:10.496782732 +0000 UTC m=+4.308656153 container remove 0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=youthful_shannon, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:10 np0005591762 systemd[1]: libpod-conmon-0581758b73d42dc335f6c2a497cae42b4bab6e1b47fe2f3d2da54be9b789a529.scope: Deactivated successfully.
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.879910924 +0000 UTC m=+0.025037155 container create 223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_goldstine, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:10 np0005591762 systemd[1]: Started libpod-conmon-223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81.scope.
Jan 22 04:34:10 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e18 e18: 3 total, 2 up, 3 in
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.924485137 +0000 UTC m=+0.069611378 container init 223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_goldstine, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.928578363 +0000 UTC m=+0.073704594 container start 223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_goldstine, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.930877396 +0000 UTC m=+0.076003627 container attach 223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 22 04:34:10 np0005591762 optimistic_goldstine[77300]: 167 167
Jan 22 04:34:10 np0005591762 systemd[1]: libpod-223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81.scope: Deactivated successfully.
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.932100621 +0000 UTC m=+0.077226853 container died 223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid)
Jan 22 04:34:10 np0005591762 systemd[1]: var-lib-containers-storage-overlay-5e8fb8e09592948b26e3d3d230db9c7ea394685faae06c89870f6d127b011f10-merged.mount: Deactivated successfully.
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.950055336 +0000 UTC m=+0.095181568 container remove 223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_goldstine, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325)
Jan 22 04:34:10 np0005591762 podman[77287]: 2026-01-22 09:34:10.870022817 +0000 UTC m=+0.015149067 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:10 np0005591762 systemd[1]: libpod-conmon-223c93c29f6fab75eed2adac6e798ffda5a69cec7217d4c572edd00891a38d81.scope: Deactivated successfully.
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.058787435 +0000 UTC m=+0.026661899 container create 5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_noyce, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:11 np0005591762 systemd[1]: Started libpod-conmon-5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa.scope.
Jan 22 04:34:11 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:11 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/410f2ac6278ca78c261fb2a528f9c87ed2d6aff16bb69c7ee11e3b0ff796c440/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:11 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/410f2ac6278ca78c261fb2a528f9c87ed2d6aff16bb69c7ee11e3b0ff796c440/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:11 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/410f2ac6278ca78c261fb2a528f9c87ed2d6aff16bb69c7ee11e3b0ff796c440/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:11 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/410f2ac6278ca78c261fb2a528f9c87ed2d6aff16bb69c7ee11e3b0ff796c440/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.110808574 +0000 UTC m=+0.078683038 container init 5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_noyce, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.115954845 +0000 UTC m=+0.083829309 container start 5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_noyce, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.117190685 +0000 UTC m=+0.085065148 container attach 5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_noyce, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.047584728 +0000 UTC m=+0.015459212 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:11 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1481446863' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 22 04:34:11 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1481446863' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]: {
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:    "2": [
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:        {
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "devices": [
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "/dev/loop3"
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            ],
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "lv_name": "ceph_lv0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "lv_size": "21470642176",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=5CGdOr-lXKE-4X6Z-yuf0-4nIp-qXpU-D897Nm,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=43df7a30-cf5f-5209-adfd-bf44298b19f2,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=e7fde3af-8dcc-4261-b14b-26da738aa0fb,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "lv_uuid": "5CGdOr-lXKE-4X6Z-yuf0-4nIp-qXpU-D897Nm",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "name": "ceph_lv0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "tags": {
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.block_uuid": "5CGdOr-lXKE-4X6Z-yuf0-4nIp-qXpU-D897Nm",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.cephx_lockbox_secret": "",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.cluster_fsid": "43df7a30-cf5f-5209-adfd-bf44298b19f2",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.cluster_name": "ceph",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.crush_device_class": "",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.encrypted": "0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.osd_fsid": "e7fde3af-8dcc-4261-b14b-26da738aa0fb",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.osd_id": "2",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.type": "block",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.vdo": "0",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:                "ceph.with_tpm": "0"
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            },
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "type": "block",
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:            "vg_name": "ceph_vg0"
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:        }
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]:    ]
Jan 22 04:34:11 np0005591762 jolly_noyce[77335]: }
Jan 22 04:34:11 np0005591762 systemd[1]: libpod-5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa.scope: Deactivated successfully.
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.349580527 +0000 UTC m=+0.317455001 container died 5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_noyce, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:11 np0005591762 systemd[1]: var-lib-containers-storage-overlay-410f2ac6278ca78c261fb2a528f9c87ed2d6aff16bb69c7ee11e3b0ff796c440-merged.mount: Deactivated successfully.
Jan 22 04:34:11 np0005591762 podman[77322]: 2026-01-22 09:34:11.369009299 +0000 UTC m=+0.336883764 container remove 5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_noyce, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Jan 22 04:34:11 np0005591762 systemd[1]: libpod-conmon-5b13137431f9b63b7acca02e3bded40150596d3e4c97f144088bd35ea01698fa.scope: Deactivated successfully.
Jan 22 04:34:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e18 _set_new_cache_sizes cache_size:1020053121 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.767564282 +0000 UTC m=+0.026286061 container create b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_mayer, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:11 np0005591762 systemd[1]: Started libpod-conmon-b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2.scope.
Jan 22 04:34:11 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.815000158 +0000 UTC m=+0.073721946 container init b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_mayer, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.819090008 +0000 UTC m=+0.077811786 container start b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_mayer, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:11 np0005591762 vibrant_mayer[77453]: 167 167
Jan 22 04:34:11 np0005591762 systemd[1]: libpod-b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2.scope: Deactivated successfully.
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.822229776 +0000 UTC m=+0.080951554 container attach b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_mayer, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.822530543 +0000 UTC m=+0.081252341 container died b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_mayer, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 22 04:34:11 np0005591762 systemd[1]: var-lib-containers-storage-overlay-94226df25f7e15f17a9881a37dbb951b49140539f427095b2739a09669ea1e1a-merged.mount: Deactivated successfully.
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.842229816 +0000 UTC m=+0.100951594 container remove b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_mayer, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:11 np0005591762 podman[77439]: 2026-01-22 09:34:11.756550031 +0000 UTC m=+0.015271828 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:11 np0005591762 systemd[1]: libpod-conmon-b8683820ad1f593c89a93380641b9c21e7ded7426f95d074377fd6506a10fce2.scope: Deactivated successfully.
Jan 22 04:34:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.012777867 +0000 UTC m=+0.027336340 container create d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 22 04:34:12 np0005591762 systemd[1]: Started libpod-conmon-d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd.scope.
Jan 22 04:34:12 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cbfa0bb069e8cbe5c3d0f9e0422fdbef7192da6840d0604aebdc0d49d8c5510/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cbfa0bb069e8cbe5c3d0f9e0422fdbef7192da6840d0604aebdc0d49d8c5510/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cbfa0bb069e8cbe5c3d0f9e0422fdbef7192da6840d0604aebdc0d49d8c5510/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cbfa0bb069e8cbe5c3d0f9e0422fdbef7192da6840d0604aebdc0d49d8c5510/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cbfa0bb069e8cbe5c3d0f9e0422fdbef7192da6840d0604aebdc0d49d8c5510/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.078208533 +0000 UTC m=+0.092767016 container init d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.082633284 +0000 UTC m=+0.097191757 container start d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.083657625 +0000 UTC m=+0.098216098 container attach d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.002189949 +0000 UTC m=+0.016748422 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test[77494]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 22 04:34:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test[77494]:                            [--no-systemd] [--no-tmpfs]
Jan 22 04:34:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test[77494]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 22 04:34:12 np0005591762 systemd[1]: libpod-d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd.scope: Deactivated successfully.
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.232728491 +0000 UTC m=+0.247286964 container died d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 22 04:34:12 np0005591762 systemd[1]: var-lib-containers-storage-overlay-1cbfa0bb069e8cbe5c3d0f9e0422fdbef7192da6840d0604aebdc0d49d8c5510-merged.mount: Deactivated successfully.
Jan 22 04:34:12 np0005591762 podman[77481]: 2026-01-22 09:34:12.252391486 +0000 UTC m=+0.266949959 container remove d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate-test, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:34:12 np0005591762 systemd[1]: libpod-conmon-d204fe37c574652b013d4fd7e5377d86b7626825d1b0c16145f6da91741225fd.scope: Deactivated successfully.
Jan 22 04:34:12 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 22 04:34:12 np0005591762 ceph-mon[75519]: Deploying daemon osd.2 on compute-2
Jan 22 04:34:12 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/2133931872' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 22 04:34:12 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/2133931872' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 22 04:34:12 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:12 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:12 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:12 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:12 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:12 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:12 np0005591762 systemd[1]: Starting Ceph osd.2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:34:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 22 04:34:12 np0005591762 podman[77645]: 2026-01-22 09:34:12.980646779 +0000 UTC m=+0.026215086 container create 59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 22 04:34:13 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:13 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ba80e8e6b004f519be1198c4b2eaf863843caed6be3b3358eda6bb3c117d70/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:13 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ba80e8e6b004f519be1198c4b2eaf863843caed6be3b3358eda6bb3c117d70/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:13 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ba80e8e6b004f519be1198c4b2eaf863843caed6be3b3358eda6bb3c117d70/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:13 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ba80e8e6b004f519be1198c4b2eaf863843caed6be3b3358eda6bb3c117d70/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:13 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ba80e8e6b004f519be1198c4b2eaf863843caed6be3b3358eda6bb3c117d70/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:13 np0005591762 podman[77645]: 2026-01-22 09:34:13.039380873 +0000 UTC m=+0.084949169 container init 59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:13 np0005591762 podman[77645]: 2026-01-22 09:34:13.045027808 +0000 UTC m=+0.090596105 container start 59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 22 04:34:13 np0005591762 podman[77645]: 2026-01-22 09:34:13.046381199 +0000 UTC m=+0.091949496 container attach 59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:13 np0005591762 podman[77645]: 2026-01-22 09:34:12.969497884 +0000 UTC m=+0.015066200 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 ceph-mon[75519]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 22 04:34:13 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1682354899' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 22 04:34:13 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1682354899' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 22 04:34:13 np0005591762 lvm[77739]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 04:34:13 np0005591762 lvm[77739]: VG ceph_vg0 finished
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 22 04:34:13 np0005591762 bash[77645]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 22 04:34:13 np0005591762 bash[77645]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 22 04:34:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate[77657]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 22 04:34:13 np0005591762 bash[77645]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 22 04:34:13 np0005591762 systemd[1]: libpod-59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110.scope: Deactivated successfully.
Jan 22 04:34:13 np0005591762 podman[77645]: 2026-01-22 09:34:13.990001584 +0000 UTC m=+1.035569880 container died 59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:14 np0005591762 systemd[1]: var-lib-containers-storage-overlay-56ba80e8e6b004f519be1198c4b2eaf863843caed6be3b3358eda6bb3c117d70-merged.mount: Deactivated successfully.
Jan 22 04:34:14 np0005591762 podman[77645]: 2026-01-22 09:34:14.011733558 +0000 UTC m=+1.057301855 container remove 59a37a5de1e53048aa35f639ea12570c016d044780a1aed9bc4ffe0766d92110 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2-activate, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:34:14 np0005591762 podman[77895]: 2026-01-22 09:34:14.152885619 +0000 UTC m=+0.027620747 container create b8b2de4b7a35fe611fd8753af17448582cff054d6cacc0890c90387c0a754cee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d673dc708ce81260c81df5dbcaa6920a1a56e78627a16e6f09092dbedf0e920f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d673dc708ce81260c81df5dbcaa6920a1a56e78627a16e6f09092dbedf0e920f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d673dc708ce81260c81df5dbcaa6920a1a56e78627a16e6f09092dbedf0e920f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d673dc708ce81260c81df5dbcaa6920a1a56e78627a16e6f09092dbedf0e920f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d673dc708ce81260c81df5dbcaa6920a1a56e78627a16e6f09092dbedf0e920f/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 podman[77895]: 2026-01-22 09:34:14.199134697 +0000 UTC m=+0.073869835 container init b8b2de4b7a35fe611fd8753af17448582cff054d6cacc0890c90387c0a754cee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 22 04:34:14 np0005591762 podman[77895]: 2026-01-22 09:34:14.203000244 +0000 UTC m=+0.077735362 container start b8b2de4b7a35fe611fd8753af17448582cff054d6cacc0890c90387c0a754cee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid)
Jan 22 04:34:14 np0005591762 bash[77895]: b8b2de4b7a35fe611fd8753af17448582cff054d6cacc0890c90387c0a754cee
Jan 22 04:34:14 np0005591762 podman[77895]: 2026-01-22 09:34:14.141899529 +0000 UTC m=+0.016634667 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:14 np0005591762 systemd[1]: Started Ceph osd.2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: set uid:gid to 167:167 (ceph:ceph)
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: pidfile_write: ignore empty --pid-file
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:14 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1107707631' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 22 04:34:14 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:14 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:14 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.626234926 +0000 UTC m=+0.028259790 container create 9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_roentgen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:34:14 np0005591762 systemd[1]: Started libpod-conmon-9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3.scope.
Jan 22 04:34:14 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.690697778 +0000 UTC m=+0.092722652 container init 9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.695453081 +0000 UTC m=+0.097477936 container start 9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_roentgen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.696557333 +0000 UTC m=+0.098582188 container attach 9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_roentgen, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True)
Jan 22 04:34:14 np0005591762 gifted_roentgen[78026]: 167 167
Jan 22 04:34:14 np0005591762 systemd[1]: libpod-9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3.scope: Deactivated successfully.
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.699804324 +0000 UTC m=+0.101829178 container died 9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_roentgen, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.614254843 +0000 UTC m=+0.016279718 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:14 np0005591762 systemd[1]: var-lib-containers-storage-overlay-479acaa3fb75dde9d19b6dc02ded24d93239926c9943c6c96f19f1670fc5f285-merged.mount: Deactivated successfully.
Jan 22 04:34:14 np0005591762 podman[78013]: 2026-01-22 09:34:14.717003314 +0000 UTC m=+0.119028168 container remove 9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_roentgen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:14 np0005591762 systemd[1]: libpod-conmon-9fa595b6a7d4d8e9b896b3741e4f140fcb093a6d40e00ac25d686ea12cdad5b3.scope: Deactivated successfully.
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb13c38000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb13c38000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb13c38000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb13c38000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 22 04:34:14 np0005591762 ceph-osd[77912]: bdev(0x55bb13c38000 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:14 np0005591762 podman[78048]: 2026-01-22 09:34:14.838996851 +0000 UTC m=+0.036161376 container create c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True)
Jan 22 04:34:14 np0005591762 systemd[1]: Started libpod-conmon-c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795.scope.
Jan 22 04:34:14 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c03c2ab01dc495106fd23f2ce19cdb2e36aa093ebb4a0277636821fef9c700e9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c03c2ab01dc495106fd23f2ce19cdb2e36aa093ebb4a0277636821fef9c700e9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c03c2ab01dc495106fd23f2ce19cdb2e36aa093ebb4a0277636821fef9c700e9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c03c2ab01dc495106fd23f2ce19cdb2e36aa093ebb4a0277636821fef9c700e9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:14 np0005591762 podman[78048]: 2026-01-22 09:34:14.893290122 +0000 UTC m=+0.090454647 container init c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_dubinsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:14 np0005591762 podman[78048]: 2026-01-22 09:34:14.899560753 +0000 UTC m=+0.096725278 container start c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_dubinsky, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:14 np0005591762 podman[78048]: 2026-01-22 09:34:14.900796442 +0000 UTC m=+0.097960977 container attach c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_dubinsky, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:14 np0005591762 podman[78048]: 2026-01-22 09:34:14.826392011 +0000 UTC m=+0.023556546 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb12e2d800 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:15 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1107707631' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 22 04:34:15 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/578410127' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 22 04:34:15 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: load: jerasure load: lrc 
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:15 np0005591762 lvm[78149]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 04:34:15 np0005591762 lvm[78149]: VG ceph_vg0 finished
Jan 22 04:34:15 np0005591762 kind_dubinsky[78068]: {}
Jan 22 04:34:15 np0005591762 systemd[1]: libpod-c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795.scope: Deactivated successfully.
Jan 22 04:34:15 np0005591762 podman[78048]: 2026-01-22 09:34:15.41654338 +0000 UTC m=+0.613707905 container died c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_dubinsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 22 04:34:15 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c03c2ab01dc495106fd23f2ce19cdb2e36aa093ebb4a0277636821fef9c700e9-merged.mount: Deactivated successfully.
Jan 22 04:34:15 np0005591762 podman[78048]: 2026-01-22 09:34:15.439362673 +0000 UTC m=+0.636527178 container remove c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_dubinsky, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Jan 22 04:34:15 np0005591762 systemd[1]: libpod-conmon-c27863d4d512191b8dd23b3cd71f0b617047e237b385a8d9e28cd35c60a9b795.scope: Deactivated successfully.
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:15 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:16 np0005591762 podman[78307]: 2026-01-22 09:34:16.07557629 +0000 UTC m=+0.037527120 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:16 np0005591762 podman[78307]: 2026-01-22 09:34:16.155633206 +0000 UTC m=+0.117584036 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/578410127' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1249265143' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13c39c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount shared_bdev_used = 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: RocksDB version: 7.9.2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Git sha 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DB SUMMARY
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DB Session ID:  0IP6X9644EM20CB5PUE2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: CURRENT file:  CURRENT
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: IDENTITY file:  IDENTITY
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.error_if_exists: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.create_if_missing: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.paranoid_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                     Options.env: 0x55bb12e81650
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                Options.info_log: 0x55bb13cad6c0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_file_opening_threads: 16
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                              Options.statistics: (nil)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.use_fsync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.max_log_file_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.allow_fallocate: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.use_direct_reads: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.create_missing_column_families: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                              Options.db_log_dir: 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                 Options.wal_dir: db.wal
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.advise_random_on_open: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.write_buffer_manager: 0x55bb13dc8a00
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                            Options.rate_limiter: (nil)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.unordered_write: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.row_cache: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                              Options.wal_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.allow_ingest_behind: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.two_write_queues: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.manual_wal_flush: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.wal_compression: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.atomic_flush: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.log_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.allow_data_in_errors: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.db_host_id: __hostname__
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_background_jobs: 4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_background_compactions: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_subcompactions: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.max_open_files: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.bytes_per_sync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.max_background_flushes: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Compression algorithms supported:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kZSTD supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kXpressCompression supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kBZip2Compression supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kLZ4Compression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kZlibCompression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: #011kSnappyCompression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec3350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec3350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec3350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cada80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cadaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec29b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cadaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec29b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cadaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec29b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 6e2ff2ca-ffb3-4ab3-96a1-7ee9c19a6793
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456465014, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456465220, "job": 1, "event": "recovery_finished"}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: freelist init
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: freelist _read_cfg
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs umount
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) close
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bdev(0x55bb13e60000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluefs mount shared_bdev_used = 4718592
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: RocksDB version: 7.9.2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Git sha 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DB SUMMARY
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DB Session ID:  0IP6X9644EM20CB5PUE3
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: CURRENT file:  CURRENT
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: IDENTITY file:  IDENTITY
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.error_if_exists: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.create_if_missing: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.paranoid_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                     Options.env: 0x55bb12e81110
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                Options.info_log: 0x55bb13cad860
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_file_opening_threads: 16
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                              Options.statistics: (nil)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.use_fsync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.max_log_file_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.allow_fallocate: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.use_direct_reads: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.create_missing_column_families: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                              Options.db_log_dir: 
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                                 Options.wal_dir: db.wal
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.advise_random_on_open: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.write_buffer_manager: 0x55bb13dc8aa0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                            Options.rate_limiter: (nil)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.unordered_write: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.row_cache: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                              Options.wal_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.allow_ingest_behind: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.two_write_queues: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.manual_wal_flush: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.wal_compression: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.atomic_flush: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.log_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.allow_data_in_errors: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.db_host_id: __hostname__
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_background_jobs: 4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_background_compactions: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_subcompactions: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.max_open_files: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.bytes_per_sync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.max_background_flushes: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Compression algorithms supported:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kZSTD supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kXpressCompression supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kBZip2Compression supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kLZ4Compression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kZlibCompression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: 	kSnappyCompression supported: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb12ec3350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec3350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad5a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec3350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad9e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec29b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad9e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec29b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:           Options.merge_operator: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.compaction_filter_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.sst_partitioner_factory: None
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb13cad9e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb12ec29b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.write_buffer_size: 16777216
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.max_write_buffer_number: 64
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.compression: LZ4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.num_levels: 7
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.level: 32767
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.compression_opts.strategy: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                  Options.compression_opts.enabled: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.arena_block_size: 1048576
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.disable_auto_compactions: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.inplace_update_support: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.bloom_locality: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                    Options.max_successive_merges: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.paranoid_file_checks: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.force_consistency_checks: 1
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.report_bg_io_stats: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                               Options.ttl: 2592000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                       Options.enable_blob_files: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                           Options.min_blob_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                          Options.blob_file_size: 268435456
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb:                Options.blob_file_starting_level: 0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e23 _set_new_cache_sizes cache_size:1020054710 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 6e2ff2ca-ffb3-4ab3-96a1-7ee9c19a6793
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456768739, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456770428, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074456, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6e2ff2ca-ffb3-4ab3-96a1-7ee9c19a6793", "db_session_id": "0IP6X9644EM20CB5PUE3", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456775459, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074456, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6e2ff2ca-ffb3-4ab3-96a1-7ee9c19a6793", "db_session_id": "0IP6X9644EM20CB5PUE3", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456776807, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074456, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6e2ff2ca-ffb3-4ab3-96a1-7ee9c19a6793", "db_session_id": "0IP6X9644EM20CB5PUE3", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074456777270, "job": 1, "event": "recovery_finished"}
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.777395301 +0000 UTC m=+0.053297836 container create 8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_raman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True)
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bb13eba000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: DB pointer 0x55bb13e76000
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 
collections: 1 last_copies: 8 last_secs: 1.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 460.80 MB usag
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: _get_class not permitted to load lua
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: _get_class not permitted to load sdk
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 load_pgs
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 load_pgs opened 0 pgs
Jan 22 04:34:16 np0005591762 ceph-osd[77912]: osd.2 0 log_to_monitors true
Jan 22 04:34:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2[77908]: 2026-01-22T09:34:16.787+0000 7fed2ee30740 -1 osd.2 0 log_to_monitors true
Jan 22 04:34:16 np0005591762 systemd[1]: Started libpod-conmon-8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07.scope.
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Jan 22 04:34:16 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/2745861301,v1:192.168.122.102:6801/2745861301]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 22 04:34:16 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.822157547 +0000 UTC m=+0.098060112 container init 8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_raman, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.826851636 +0000 UTC m=+0.102754181 container start 8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_raman, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.828597196 +0000 UTC m=+0.104499741 container attach 8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_raman, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 22 04:34:16 np0005591762 modest_raman[78879]: 167 167
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.829863514 +0000 UTC m=+0.105766059 container died 8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_raman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:34:16 np0005591762 systemd[1]: libpod-8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07.scope: Deactivated successfully.
Jan 22 04:34:16 np0005591762 systemd[1]: var-lib-containers-storage-overlay-068c5c0c7ab5808d90c5c6ad7ac3cb06b7636fba2260e6612a0be415aecf9324-merged.mount: Deactivated successfully.
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.846226367 +0000 UTC m=+0.122128912 container remove 8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_raman, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:16 np0005591762 podman[78650]: 2026-01-22 09:34:16.766664023 +0000 UTC m=+0.042566578 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:16 np0005591762 systemd[1]: libpod-conmon-8994d5e4dd074a1cb14ecba110b54f7ab7bf8cd59abe67bbdbdec76f8a832e07.scope: Deactivated successfully.
Jan 22 04:34:16 np0005591762 podman[78901]: 2026-01-22 09:34:16.957151239 +0000 UTC m=+0.027373139 container create c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_rosalind, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 22 04:34:16 np0005591762 systemd[1]: Started libpod-conmon-c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae.scope.
Jan 22 04:34:17 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:17 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d80f3ea5cfef15bce4d5a01aefc0fcf524e8ed3037259146d6642b7ced04eef/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:17 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d80f3ea5cfef15bce4d5a01aefc0fcf524e8ed3037259146d6642b7ced04eef/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:17 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d80f3ea5cfef15bce4d5a01aefc0fcf524e8ed3037259146d6642b7ced04eef/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:17 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d80f3ea5cfef15bce4d5a01aefc0fcf524e8ed3037259146d6642b7ced04eef/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:17 np0005591762 podman[78901]: 2026-01-22 09:34:17.01068995 +0000 UTC m=+0.080911839 container init c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_rosalind, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS)
Jan 22 04:34:17 np0005591762 podman[78901]: 2026-01-22 09:34:17.015688812 +0000 UTC m=+0.085910701 container start c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 22 04:34:17 np0005591762 podman[78901]: 2026-01-22 09:34:17.01678038 +0000 UTC m=+0.087002280 container attach c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_rosalind, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:34:17 np0005591762 podman[78901]: 2026-01-22 09:34:16.94622314 +0000 UTC m=+0.016445051 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1249265143' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: from='osd.2 [v2:192.168.122.102:6800/2745861301,v1:192.168.122.102:6801/2745861301]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/909696606' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]: [
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:    {
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "available": false,
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "being_replaced": false,
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "ceph_device_lvm": false,
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "lsm_data": {},
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "lvs": [],
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "path": "/dev/sr0",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "rejected_reasons": [
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "Insufficient space (<5GB)",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "Has a FileSystem"
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        ],
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        "sys_api": {
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "actuators": null,
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "device_nodes": [
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:                "sr0"
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            ],
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "devname": "sr0",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "human_readable_size": "474.00 KB",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "id_bus": "ata",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "model": "QEMU DVD-ROM",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "nr_requests": "64",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "parent": "/dev/sr0",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "partitions": {},
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "path": "/dev/sr0",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "removable": "1",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "rev": "2.5+",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "ro": "0",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "rotational": "1",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "sas_address": "",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "sas_device_handle": "",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "scheduler_mode": "mq-deadline",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "sectors": 0,
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "sectorsize": "2048",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "size": 485376.0,
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "support_discard": "2048",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "type": "disk",
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:            "vendor": "QEMU"
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:        }
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]:    }
Jan 22 04:34:17 np0005591762 hardcore_rosalind[78914]: ]
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 22 04:34:17 np0005591762 systemd[1]: libpod-c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae.scope: Deactivated successfully.
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Jan 22 04:34:17 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/2745861301,v1:192.168.122.102:6801/2745861301]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 22 04:34:17 np0005591762 conmon[78914]: conmon c6cdd7cc891c5e5a4a7a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae.scope/container/memory.events
Jan 22 04:34:17 np0005591762 podman[78901]: 2026-01-22 09:34:17.495612994 +0000 UTC m=+0.565834894 container died c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_rosalind, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 22 04:34:17 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 22 04:34:17 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 22 04:34:18 np0005591762 systemd[1]: var-lib-containers-storage-overlay-9d80f3ea5cfef15bce4d5a01aefc0fcf524e8ed3037259146d6642b7ced04eef-merged.mount: Deactivated successfully.
Jan 22 04:34:18 np0005591762 podman[78901]: 2026-01-22 09:34:18.096716197 +0000 UTC m=+1.166938087 container remove c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 22 04:34:18 np0005591762 systemd[1]: libpod-conmon-c6cdd7cc891c5e5a4a7a52319e0145d490433061cd2229da2133e7be534a16ae.scope: Deactivated successfully.
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0 done with init, starting boot process
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0 start_boot
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 22 04:34:18 np0005591762 ceph-osd[77912]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/909696606' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='osd.2 [v2:192.168.122.102:6800/2745861301,v1:192.168.122.102:6801/2745861301]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:18 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Adjusting osd_memory_target on compute-2 to 128.7M
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Unable to set osd_memory_target on compute-2 to 134966067: error parsing value: Value '134966067' is below minimum 939524096
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.conf
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.conf
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.conf
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Health check cleared: POOL_APP_NOT_ENABLED (was: 2 pool(s) do not have an application enabled)
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Cluster is now healthy
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:19 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 86.971 iops: 22264.553 elapsed_sec: 0.135
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: log_channel(cluster) log [WRN] : OSD bench result of 22264.553075 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 22 04:34:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2[77908]: 2026-01-22T09:34:20.245+0000 7fed2adb3640 -1 osd.2 0 waiting for initial osdmap
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 0 waiting for initial osdmap
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 check_osdmap_features require_osd_release unknown -> squid
Jan 22 04:34:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-osd-2[77908]: 2026-01-22T09:34:20.262+0000 7fed263db640 -1 osd.2 25 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 set_numa_affinity not setting numa affinity
Jan 22 04:34:20 np0005591762 ceph-osd[77912]: osd.2 25 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 22 04:34:20 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:20 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/4274044498' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 22 04:34:20 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/4274044498' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e26 e26: 3 total, 3 up, 3 in
Jan 22 04:34:21 np0005591762 ceph-osd[77912]: osd.2 26 state: booting -> active
Jan 22 04:34:21 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 26 pg[3.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=13/13 les/c/f=15/15/0 sis=26) [2] r=0 lpr=26 pi=[13,26)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:34:21 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=26) [2] r=0 lpr=26 pi=[16,26)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: OSD bench result of 22264.553075 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1861953495' entity='client.admin' 
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: osd.2 [v2:192.168.122.102:6800/2745861301,v1:192.168.122.102:6801/2745861301] boot
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e27 e27: 3 total, 3 up, 3 in
Jan 22 04:34:22 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 27 pg[3.0( empty local-lis/les=26/27 n=0 ec=13/13 lis/c=13/13 les/c/f=15/15/0 sis=26) [2] r=0 lpr=26 pi=[13,26)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:34:22 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=26/27 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=26) [2] r=0 lpr=26 pi=[16,26)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:34:22 np0005591762 ceph-mon[75519]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 22 04:34:22 np0005591762 ceph-mon[75519]: Saving service ingress.rgw.default spec with placement count:2
Jan 22 04:34:22 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:22 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:22 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.rfmoog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:23 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring mgr.compute-0.rfmoog (monmap changed)...
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring daemon mgr.compute-0.rfmoog on compute-0
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Saving service node-exporter spec with placement *
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Saving service grafana spec with placement compute-0;count:1
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Saving service prometheus spec with placement compute-0;count:1
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Saving service alertmanager spec with placement compute-0;count:1
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring osd.0 (monmap changed)...
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring daemon osd.0 on compute-0
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1083578268' entity='client.admin' 
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 22 04:34:24 np0005591762 ceph-mon[75519]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1823157614' entity='client.admin' 
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: Reconfiguring osd.1 (monmap changed)...
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: Reconfiguring daemon osd.1 on compute-1
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/4171560874' entity='client.admin' 
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 22 04:34:25 np0005591762 ceph-mon[75519]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.094752433 +0000 UTC m=+0.028089890 container create 193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nervous_turing, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:34:26 np0005591762 systemd[1]: Started libpod-conmon-193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de.scope.
Jan 22 04:34:26 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.151166493 +0000 UTC m=+0.084503950 container init 193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nervous_turing, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.155153839 +0000 UTC m=+0.088491296 container start 193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nervous_turing, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.156234697 +0000 UTC m=+0.089572173 container attach 193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nervous_turing, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:26 np0005591762 nervous_turing[80432]: 167 167
Jan 22 04:34:26 np0005591762 systemd[1]: libpod-193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de.scope: Deactivated successfully.
Jan 22 04:34:26 np0005591762 conmon[80432]: conmon 193a1b4ad264c7a6e78a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de.scope/container/memory.events
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.158684304 +0000 UTC m=+0.092021760 container died 193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nervous_turing, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:26 np0005591762 systemd[1]: var-lib-containers-storage-overlay-b09af75d403bce4f5c498a739ff08275924dc81862b2d1fa04535b3472fce78f-merged.mount: Deactivated successfully.
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.174796304 +0000 UTC m=+0.108133762 container remove 193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nervous_turing, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 22 04:34:26 np0005591762 podman[80419]: 2026-01-22 09:34:26.08429458 +0000 UTC m=+0.017632047 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:26 np0005591762 systemd[1]: libpod-conmon-193a1b4ad264c7a6e78a5734224a1d36de9da6921035f5dcb3321de5742272de.scope: Deactivated successfully.
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.515766601 +0000 UTC m=+0.025770619 container create dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jackson, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 22 04:34:26 np0005591762 systemd[1]: Started libpod-conmon-dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423.scope.
Jan 22 04:34:26 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.564477981 +0000 UTC m=+0.074482009 container init dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jackson, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.571167391 +0000 UTC m=+0.081171408 container start dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jackson, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.572702414 +0000 UTC m=+0.082706432 container attach dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jackson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Jan 22 04:34:26 np0005591762 awesome_jackson[80525]: 167 167
Jan 22 04:34:26 np0005591762 systemd[1]: libpod-dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423.scope: Deactivated successfully.
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.574313741 +0000 UTC m=+0.084317760 container died dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:34:26 np0005591762 systemd[1]: var-lib-containers-storage-overlay-0f542b0be3a25f10c365a8bb9ff052ca51cbde871ef19bec1de1fd6f7557e156-merged.mount: Deactivated successfully.
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.59118858 +0000 UTC m=+0.101192598 container remove dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jackson, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:26 np0005591762 podman[80511]: 2026-01-22 09:34:26.504651859 +0000 UTC m=+0.014655887 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:26 np0005591762 systemd[1]: libpod-conmon-dd6624a490abc4598a22d1f61132aeee27d9c0bb5992f3c70cda5901cccb0423.scope: Deactivated successfully.
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e27 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: Reconfiguring mgr.compute-2.bisona (monmap changed)...
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.bisona", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: Reconfiguring daemon mgr.compute-2.bisona on compute-2
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/308023789' entity='client.admin' 
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:26 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:27 np0005591762 podman[80647]: 2026-01-22 09:34:27.043083047 +0000 UTC m=+0.031698112 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Jan 22 04:34:27 np0005591762 podman[80647]: 2026-01-22 09:34:27.120639099 +0000 UTC m=+0.109254174 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:27 np0005591762 python3[80741]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:34:28 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/285753853' entity='client.admin' 
Jan 22 04:34:29 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/245233264' entity='client.admin' 
Jan 22 04:34:29 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1610688415' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1610688415' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3348958984' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aqqfbf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aqqfbf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: from='mgr.14122 192.168.122.100:0/608607566' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:30 np0005591762 ceph-mon[75519]: Deploying daemon rgw.rgw.compute-2.aqqfbf on compute-2
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  1: '-n'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  2: 'mgr.compute-2.bisona'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  3: '-f'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  4: '--setuser'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  5: 'ceph'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  6: '--setgroup'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  7: 'ceph'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  8: '--default-log-to-file=false'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  9: '--default-log-to-journald=true'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr respawn  exe_path /proc/self/exe
Jan 22 04:34:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setuser ceph since I am not root
Jan 22 04:34:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setgroup ceph since I am not root
Jan 22 04:34:30 np0005591762 systemd[1]: session-28.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 28 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 28.
Jan 22 04:34:30 np0005591762 systemd[1]: session-22.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 22 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd[1]: session-19.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 19 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 22.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 19.
Jan 22 04:34:30 np0005591762 systemd[1]: session-21.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 21 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd[1]: session-23.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd[1]: session-27.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 31 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 27 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 23 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 21.
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: pidfile_write: ignore empty --pid-file
Jan 22 04:34:30 np0005591762 systemd[1]: session-29.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 23.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 29 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd[1]: session-30.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 27.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 30 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd[1]: session-26.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 26 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd[1]: session-24.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 24 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd[1]: session-25.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Session 25 logged out. Waiting for processes to exit.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 29.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 30.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 26.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 24.
Jan 22 04:34:30 np0005591762 systemd-logind[744]: Removed session 25.
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'alerts'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:30.778+0000 7f58cea98140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'balancer'
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:30 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'cephadm'
Jan 22 04:34:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:30.849+0000 7f58cea98140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.882070191 +0000 UTC m=+0.024091680 container create 00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_bose, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:30 np0005591762 systemd[1]: Started libpod-conmon-00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79.scope.
Jan 22 04:34:30 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.924644197 +0000 UTC m=+0.066665686 container init 00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_bose, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.929471898 +0000 UTC m=+0.071493387 container start 00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_bose, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.93144146 +0000 UTC m=+0.073462969 container attach 00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_bose, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:34:30 np0005591762 frosty_bose[80868]: 167 167
Jan 22 04:34:30 np0005591762 systemd[1]: libpod-00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 conmon[80868]: conmon 00b1a3bcbd186ab3986f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79.scope/container/memory.events
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.933107029 +0000 UTC m=+0.075128518 container died 00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_bose, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:30 np0005591762 systemd[1]: var-lib-containers-storage-overlay-3df30ed859671094e6268373c8b375cb04c2825461f4dc94d8e460ccaf391dd4-merged.mount: Deactivated successfully.
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.951182559 +0000 UTC m=+0.093204048 container remove 00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_bose, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:30 np0005591762 podman[80855]: 2026-01-22 09:34:30.87217436 +0000 UTC m=+0.014195869 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:30 np0005591762 systemd[1]: libpod-conmon-00b1a3bcbd186ab3986f79020f8823083129924e48df28c73b7d726c6dc2ff79.scope: Deactivated successfully.
Jan 22 04:34:30 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:31 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:31 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:31 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:31 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:31 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:31 np0005591762 systemd[1]: Starting Ceph rgw.rgw.compute-2.aqqfbf for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:34:31 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'crash'
Jan 22 04:34:31 np0005591762 podman[81008]: 2026-01-22 09:34:31.533802297 +0000 UTC m=+0.027542675 container create 3d8621025344e4b9022303661ceb74fe8e580d837a868d1cfdd3cf46e60aef4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-rgw-rgw-compute-2-aqqfbf, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:34:31 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c757b9e1d4490326c1dc1b691d8f3281e4e3210a5797ca97c2c50d50b86411e8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:31 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c757b9e1d4490326c1dc1b691d8f3281e4e3210a5797ca97c2c50d50b86411e8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:31 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c757b9e1d4490326c1dc1b691d8f3281e4e3210a5797ca97c2c50d50b86411e8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:31 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c757b9e1d4490326c1dc1b691d8f3281e4e3210a5797ca97c2c50d50b86411e8/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.aqqfbf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:31 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3348958984' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 22 04:34:31 np0005591762 ceph-mgr[75802]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:31 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'dashboard'
Jan 22 04:34:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:31.571+0000 7f58cea98140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:31 np0005591762 podman[81008]: 2026-01-22 09:34:31.581787582 +0000 UTC m=+0.075527960 container init 3d8621025344e4b9022303661ceb74fe8e580d837a868d1cfdd3cf46e60aef4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-rgw-rgw-compute-2-aqqfbf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:34:31 np0005591762 podman[81008]: 2026-01-22 09:34:31.585386526 +0000 UTC m=+0.079126894 container start 3d8621025344e4b9022303661ceb74fe8e580d837a868d1cfdd3cf46e60aef4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-rgw-rgw-compute-2-aqqfbf, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 22 04:34:31 np0005591762 bash[81008]: 3d8621025344e4b9022303661ceb74fe8e580d837a868d1cfdd3cf46e60aef4e
Jan 22 04:34:31 np0005591762 podman[81008]: 2026-01-22 09:34:31.521677843 +0000 UTC m=+0.015418232 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:34:31 np0005591762 systemd[1]: Started Ceph rgw.rgw.compute-2.aqqfbf for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:34:31 np0005591762 systemd[1]: session-31.scope: Deactivated successfully.
Jan 22 04:34:31 np0005591762 systemd[1]: session-31.scope: Consumed 47.896s CPU time.
Jan 22 04:34:31 np0005591762 systemd-logind[744]: Removed session 31.
Jan 22 04:34:31 np0005591762 radosgw[81024]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 22 04:34:31 np0005591762 radosgw[81024]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 22 04:34:31 np0005591762 radosgw[81024]: framework: beast
Jan 22 04:34:31 np0005591762 radosgw[81024]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 22 04:34:31 np0005591762 radosgw[81024]: init_numa not setting numa affinity
Jan 22 04:34:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e27 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'devicehealth'
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'diskprediction_local'
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:32.121+0000 7f58cea98140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]:  from numpy import show_config as show_numpy_config
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'influx'
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:32.262+0000 7f58cea98140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'insights'
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:32.323+0000 7f58cea98140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'iostat'
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'k8sevents'
Jan 22 04:34:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:32.442+0000 7f58cea98140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e28 e28: 3 total, 3 up, 3 in
Jan 22 04:34:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Jan 22 04:34:32 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2506501459' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'localpool'
Jan 22 04:34:32 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mds_autoscaler'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mirroring'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'nfs'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:33.294+0000 7f58cea98140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'orchestrator'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:33.482+0000 7f58cea98140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_perf_query'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:33.548+0000 7f58cea98140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_support'
Jan 22 04:34:33 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 22 04:34:33 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/2506501459' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 22 04:34:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e29 e29: 3 total, 3 up, 3 in
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:33.609+0000 7f58cea98140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'pg_autoscaler'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:33.682+0000 7f58cea98140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'progress'
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:33.743+0000 7f58cea98140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:33 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'prometheus'
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:34.043+0000 7f58cea98140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rbd_support'
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:34.131+0000 7f58cea98140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'restful'
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rgw'
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:34.508+0000 7f58cea98140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rook'
Jan 22 04:34:34 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 22 04:34:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e30 e30: 3 total, 3 up, 3 in
Jan 22 04:34:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Jan 22 04:34:34 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:34.989+0000 7f58cea98140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:34 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'selftest'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.051+0000 7f58cea98140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'snap_schedule'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.120+0000 7f58cea98140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'stats'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'status'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.248+0000 7f58cea98140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telegraf'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.309+0000 7f58cea98140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telemetry'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.443+0000 7f58cea98140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'test_orchestrator'
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e31 e31: 3 total, 3 up, 3 in
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.635+0000 7f58cea98140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'volumes'
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.864+0000 7f58cea98140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'zabbix'
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e32 e32: 3 total, 3 up, 3 in
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon metadata", "id": "compute-1"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon metadata", "id": "compute-2"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.rfmoog", "id": "compute-0.rfmoog"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mgr metadata", "who": "compute-0.rfmoog", "id": "compute-0.rfmoog"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-2.bisona", "id": "compute-2.bisona"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mgr metadata", "who": "compute-2.bisona", "id": "compute-2.bisona"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-1.upcmhd", "id": "compute-1.upcmhd"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mgr metadata", "who": "compute-1.upcmhd", "id": "compute-1.upcmhd"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds metadata"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e1 all = 1
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon metadata"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:35.931+0000 7f58cea98140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: ms_deliver_dispatch: unhandled message 0x561421dc1860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: mgr load Constructed class from module: dashboard
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 22 04:34:35 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Starting engine...
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"} v 0)
Jan 22 04:34:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Engine started...
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"} v 0)
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"}]: dispatch
Jan 22 04:34:36 np0005591762 systemd-logind[744]: New session 32 of user ceph-admin.
Jan 22 04:34:36 np0005591762 systemd[1]: Started Session 32 of User ceph-admin.
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: Active manager daemon compute-0.rfmoog restarted
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: Activating manager daemon compute-0.rfmoog
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: Manager daemon compute-0.rfmoog is now available
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"}]: dispatch
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:36 np0005591762 podman[81733]: 2026-01-22 09:34:36.781835394 +0000 UTC m=+0.034017308 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:36 np0005591762 podman[81733]: 2026-01-22 09:34:36.858137974 +0000 UTC m=+0.110319888 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/GRAFANA_API_USERNAME}] v 0)
Jan 22 04:34:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-2.devices.0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-2}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-1.devices.0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-1}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/GRAFANA_API_PASSWORD}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-2.devices.0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-2}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"} v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:37] ENGINE Bus STARTING
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:37] ENGINE Serving on http://192.168.122.100:8765
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:37] ENGINE Serving on https://192.168.122.100:7150
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:37] ENGINE Bus STARTED
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:37] ENGINE Client ('192.168.122.100', 40276) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-1.devices.0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-1}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Jan 22 04:34:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/ALERTMANAGER_API_HOST}] v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Adjusting osd_memory_target on compute-1 to 128.7M
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Unable to set osd_memory_target on compute-1 to 134966067: error parsing value: Value '134966067' is below minimum 939524096
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Adjusting osd_memory_target on compute-0 to 128.7M
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Unable to set osd_memory_target on compute-0 to 134966067: error parsing value: Value '134966067' is below minimum 939524096
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 192.168.122.100:0/392909353' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.conf
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.conf
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.conf
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Jan 22 04:34:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/PROMETHEUS_API_HOST}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-2.devices.0}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-2}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-1.devices.0}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-1}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/GRAFANA_API_URL}] v 0)
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.102:0/3366931585' entity='client.rgw.rgw.compute-2.aqqfbf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: from='mgr.14385 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 22 04:34:39 np0005591762 radosgw[81024]: v1 topic migration: starting v1 topic migration..
Jan 22 04:34:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-rgw-rgw-compute-2-aqqfbf[81020]: 2026-01-22T09:34:39.984+0000 7fdab2a79980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 22 04:34:39 np0005591762 radosgw[81024]: LDAP not started since no server URIs were provided in the configuration.
Jan 22 04:34:39 np0005591762 radosgw[81024]: v1 topic migration: finished v1 topic migration
Jan 22 04:34:39 np0005591762 radosgw[81024]: framework: beast
Jan 22 04:34:39 np0005591762 radosgw[81024]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 22 04:34:39 np0005591762 radosgw[81024]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 22 04:34:40 np0005591762 radosgw[81024]: starting handler: beast
Jan 22 04:34:40 np0005591762 radosgw[81024]: set uid:gid to 167:167 (ceph:ceph)
Jan 22 04:34:40 np0005591762 radosgw[81024]: mgrc service_daemon_register rgw.24184 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC 7763 64-Core Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.aqqfbf,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7865364,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=30fc5159-0993-4ff0-a95e-d1f2df875388,zone_name=default,zonegroup_id=466b069f-ae0a-4d3b-a92d-186e5cb7d7b9,zonegroup_name=default}
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  1: '-n'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  2: 'mgr.compute-2.bisona'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  3: '-f'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  4: '--setuser'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  5: 'ceph'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  6: '--setgroup'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  7: 'ceph'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  8: '--default-log-to-file=false'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  9: '--default-log-to-journald=true'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr respawn  exe_path /proc/self/exe
Jan 22 04:34:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setuser ceph since I am not root
Jan 22 04:34:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setgroup ceph since I am not root
Jan 22 04:34:40 np0005591762 systemd[1]: session-32.scope: Deactivated successfully.
Jan 22 04:34:40 np0005591762 systemd[1]: session-32.scope: Consumed 2.772s CPU time.
Jan 22 04:34:40 np0005591762 systemd-logind[744]: Session 32 logged out. Waiting for processes to exit.
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: pidfile_write: ignore empty --pid-file
Jan 22 04:34:40 np0005591762 systemd-logind[744]: Removed session 32.
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'alerts'
Jan 22 04:34:40 np0005591762 ceph-mon[75519]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 22 04:34:40 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.rgw.rgw.compute-2.aqqfbf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 22 04:34:40 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3397378284' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 22 04:34:40 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3397378284' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:40.944+0000 7f08891fe140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:40 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'balancer'
Jan 22 04:34:41 np0005591762 ceph-mgr[75802]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:41.016+0000 7f08891fe140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:41 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'cephadm'
Jan 22 04:34:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0)
Jan 22 04:34:41 np0005591762 ceph-mon[75519]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/613681825' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 22 04:34:41 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'crash'
Jan 22 04:34:41 np0005591762 ceph-mgr[75802]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:41.684+0000 7f08891fe140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:41 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'dashboard'
Jan 22 04:34:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:41 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/613681825' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 22 04:34:41 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'devicehealth'
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:42.229+0000 7f08891fe140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'diskprediction_local'
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]:  from numpy import show_config as show_numpy_config
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:42.370+0000 7f08891fe140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'influx'
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:42.433+0000 7f08891fe140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'insights'
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'iostat'
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:42.555+0000 7f08891fe140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'k8sevents'
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'localpool'
Jan 22 04:34:42 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mds_autoscaler'
Jan 22 04:34:42 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mirroring'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'nfs'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:43.416+0000 7f08891fe140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'orchestrator'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:43.606+0000 7f08891fe140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_perf_query'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:43.673+0000 7f08891fe140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_support'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:43.732+0000 7f08891fe140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'pg_autoscaler'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:43.801+0000 7f08891fe140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'progress'
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:43.864+0000 7f08891fe140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'prometheus'
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:44.164+0000 7f08891fe140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rbd_support'
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:44.249+0000 7f08891fe140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'restful'
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rgw'
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:44.628+0000 7f08891fe140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rook'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.118+0000 7f08891fe140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'selftest'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.180+0000 7f08891fe140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'snap_schedule'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.250+0000 7f08891fe140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'stats'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'status'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.380+0000 7f08891fe140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telegraf'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.442+0000 7f08891fe140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telemetry'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.576+0000 7f08891fe140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'test_orchestrator'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.767+0000 7f08891fe140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'volumes'
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:45.997+0000 7f08891fe140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'zabbix'
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:46.057+0000 7f08891fe140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: ms_deliver_dispatch: unhandled message 0x55d2c4753a00 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 22 04:34:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setuser ceph since I am not root
Jan 22 04:34:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setgroup ceph since I am not root
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: pidfile_write: ignore empty --pid-file
Jan 22 04:34:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'alerts'
Jan 22 04:34:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:46.239+0000 7f1f6c1d0140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'balancer'
Jan 22 04:34:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:46.309+0000 7f1f6c1d0140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'cephadm'
Jan 22 04:34:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'crash'
Jan 22 04:34:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:46.965+0000 7f1f6c1d0140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:34:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'dashboard'
Jan 22 04:34:47 np0005591762 ceph-mon[75519]: Active manager daemon compute-0.rfmoog restarted
Jan 22 04:34:47 np0005591762 ceph-mon[75519]: Activating manager daemon compute-0.rfmoog
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'devicehealth'
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:47.504+0000 7f1f6c1d0140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'diskprediction_local'
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]:  from numpy import show_config as show_numpy_config
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:47.644+0000 7f1f6c1d0140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'influx'
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:47.706+0000 7f1f6c1d0140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'insights'
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'iostat'
Jan 22 04:34:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:47.824+0000 7f1f6c1d0140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:34:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'k8sevents'
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'localpool'
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mds_autoscaler'
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mirroring'
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'nfs'
Jan 22 04:34:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:48.670+0000 7f1f6c1d0140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'orchestrator'
Jan 22 04:34:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:48.857+0000 7f1f6c1d0140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_perf_query'
Jan 22 04:34:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:48.924+0000 7f1f6c1d0140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_support'
Jan 22 04:34:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:48.982+0000 7f1f6c1d0140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:34:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'pg_autoscaler'
Jan 22 04:34:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:49.049+0000 7f1f6c1d0140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'progress'
Jan 22 04:34:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:49.111+0000 7f1f6c1d0140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'prometheus'
Jan 22 04:34:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:49.407+0000 7f1f6c1d0140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rbd_support'
Jan 22 04:34:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:49.492+0000 7f1f6c1d0140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'restful'
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rgw'
Jan 22 04:34:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:49.866+0000 7f1f6c1d0140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:34:49 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rook'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.347+0000 7f1f6c1d0140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'selftest'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.409+0000 7f1f6c1d0140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'snap_schedule'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.478+0000 7f1f6c1d0140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'stats'
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'status'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.607+0000 7f1f6c1d0140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telegraf'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.667+0000 7f1f6c1d0140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telemetry'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.800+0000 7f1f6c1d0140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'test_orchestrator'
Jan 22 04:34:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:50.990+0000 7f1f6c1d0140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:34:50 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'volumes'
Jan 22 04:34:51 np0005591762 systemd[1]: Stopping User Manager for UID 42477...
Jan 22 04:34:51 np0005591762 systemd[72496]: Activating special unit Exit the Session...
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped target Main User Target.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped target Basic System.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped target Paths.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped target Sockets.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped target Timers.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 04:34:51 np0005591762 systemd[72496]: Closed D-Bus User Message Bus Socket.
Jan 22 04:34:51 np0005591762 systemd[72496]: Stopped Create User's Volatile Files and Directories.
Jan 22 04:34:51 np0005591762 systemd[72496]: Removed slice User Application Slice.
Jan 22 04:34:51 np0005591762 systemd[72496]: Reached target Shutdown.
Jan 22 04:34:51 np0005591762 systemd[72496]: Finished Exit the Session.
Jan 22 04:34:51 np0005591762 systemd[72496]: Reached target Exit the Session.
Jan 22 04:34:51 np0005591762 systemd[1]: user@42477.service: Deactivated successfully.
Jan 22 04:34:51 np0005591762 systemd[1]: Stopped User Manager for UID 42477.
Jan 22 04:34:51 np0005591762 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 22 04:34:51 np0005591762 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 22 04:34:51 np0005591762 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 22 04:34:51 np0005591762 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 22 04:34:51 np0005591762 systemd[1]: Removed slice User Slice of UID 42477.
Jan 22 04:34:51 np0005591762 systemd[1]: user-42477.slice: Consumed 51.430s CPU time.
Jan 22 04:34:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:51.219+0000 7f1f6c1d0140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'zabbix'
Jan 22 04:34:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:34:51.281+0000 7f1f6c1d0140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: mgr load Constructed class from module: dashboard
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Starting engine...
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: ms_deliver_dispatch: unhandled message 0x55d7d8151860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 22 04:34:51 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Engine started...
Jan 22 04:34:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 22 04:34:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:51 np0005591762 systemd[1]: Created slice User Slice of UID 42477.
Jan 22 04:34:51 np0005591762 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 22 04:34:51 np0005591762 systemd-logind[744]: New session 33 of user ceph-admin.
Jan 22 04:34:51 np0005591762 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 22 04:34:51 np0005591762 systemd[1]: Starting User Manager for UID 42477...
Jan 22 04:34:51 np0005591762 systemd[82971]: Queued start job for default target Main User Target.
Jan 22 04:34:51 np0005591762 systemd[82971]: Created slice User Application Slice.
Jan 22 04:34:51 np0005591762 systemd[82971]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 04:34:51 np0005591762 systemd[82971]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 04:34:51 np0005591762 systemd[82971]: Reached target Paths.
Jan 22 04:34:51 np0005591762 systemd[82971]: Reached target Timers.
Jan 22 04:34:51 np0005591762 systemd[82971]: Starting D-Bus User Message Bus Socket...
Jan 22 04:34:51 np0005591762 systemd[82971]: Starting Create User's Volatile Files and Directories...
Jan 22 04:34:51 np0005591762 systemd[82971]: Listening on D-Bus User Message Bus Socket.
Jan 22 04:34:51 np0005591762 systemd[82971]: Reached target Sockets.
Jan 22 04:34:51 np0005591762 systemd[82971]: Finished Create User's Volatile Files and Directories.
Jan 22 04:34:51 np0005591762 systemd[82971]: Reached target Basic System.
Jan 22 04:34:51 np0005591762 systemd[82971]: Reached target Main User Target.
Jan 22 04:34:51 np0005591762 systemd[82971]: Startup finished in 83ms.
Jan 22 04:34:51 np0005591762 systemd[1]: Started User Manager for UID 42477.
Jan 22 04:34:51 np0005591762 systemd[1]: Started Session 33 of User ceph-admin.
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: Active manager daemon compute-0.rfmoog restarted
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: Activating manager daemon compute-0.rfmoog
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: Manager daemon compute-0.rfmoog is now available
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"}]: dispatch
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"}]: dispatch
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e2 new map
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e2 print_map
                                              e2
                                              btime 2026-01-22T09:34:52:434317+0000
                                              enable_multiple, ever_enabled_multiple: 1,1
                                              default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                              legacy client fscid: 1
                                              
                                              Filesystem 'cephfs' (1)
                                              fs_name	cephfs
                                              epoch	2
                                              flags	12 joinable allow_snaps allow_multimds_snaps
                                              created	2026-01-22T09:34:52.434283+0000
                                              modified	2026-01-22T09:34:52.434283+0000
                                              tableserver	0
                                              root	0
                                              session_timeout	60
                                              session_autoclose	300
                                              max_file_size	1099511627776
                                              max_xattr_size	65536
                                              required_client_features	{}
                                              last_failure	0
                                              last_failure_osd_epoch	0
                                              compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                              max_mds	1
                                              in	
                                              up	{}
                                              failed	
                                              damaged	
                                              stopped	
                                              data_pools	[7]
                                              metadata_pool	6
                                              inline_data	disabled
                                              balancer	
                                              bal_rank_mask	-1
                                              standby_count_wanted	0
                                              qdb_cluster	leader: 0 members: 
Jan 22 04:34:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 22 04:34:52 np0005591762 podman[83095]: 2026-01-22 09:34:52.463001527 +0000 UTC m=+0.041377081 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:34:52 np0005591762 podman[83095]: 2026-01-22 09:34:52.539946638 +0000 UTC m=+0.118322182 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:52] ENGINE Bus STARTING
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:52] ENGINE Serving on https://192.168.122.100:7150
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:52] ENGINE Client ('192.168.122.100', 34688) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:52] ENGINE Serving on http://192.168.122.100:8765
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:34:52] ENGINE Bus STARTED
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 22 04:34:53 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Adjusting osd_memory_target on compute-1 to 128.7M
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Unable to set osd_memory_target on compute-1 to 134966067: error parsing value: Value '134966067' is below minimum 939524096
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.conf
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.conf
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.conf
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:34:54 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:55 np0005591762 ceph-mon[75519]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 22 04:34:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:34:57 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:57 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:57 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:57 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:57 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:57 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:57 np0005591762 ceph-mon[75519]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 22 04:34:57 np0005591762 systemd[1]: Reloading.
Jan 22 04:34:57 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:34:57 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:34:57 np0005591762 systemd[1]: Starting Ceph node-exporter.compute-2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:34:58 np0005591762 bash[84416]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 22 04:34:58 np0005591762 bash[84416]: Getting image source signatures
Jan 22 04:34:58 np0005591762 bash[84416]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 22 04:34:58 np0005591762 bash[84416]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 22 04:34:58 np0005591762 bash[84416]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 22 04:34:58 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 22 04:34:58 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/1531131488' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 22 04:34:58 np0005591762 ceph-mon[75519]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 22 04:34:58 np0005591762 ceph-mon[75519]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 22 04:34:59 np0005591762 bash[84416]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 22 04:34:59 np0005591762 bash[84416]: Writing manifest to image destination
Jan 22 04:34:59 np0005591762 podman[84416]: 2026-01-22 09:34:59.03833936 +0000 UTC m=+1.048415520 container create 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:34:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c2a6837c7754a8872d3fa1e4ce5ea6564da6f56466acf2d4c1c21f4d87c5c35/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 22 04:34:59 np0005591762 podman[84416]: 2026-01-22 09:34:59.072977409 +0000 UTC m=+1.083053568 container init 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:34:59 np0005591762 podman[84416]: 2026-01-22 09:34:59.077070243 +0000 UTC m=+1.087146402 container start 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:34:59 np0005591762 bash[84416]: 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000
Jan 22 04:34:59 np0005591762 podman[84416]: 2026-01-22 09:34:59.0281107 +0000 UTC m=+1.038186880 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 22 04:34:59 np0005591762 systemd[1]: Started Ceph node-exporter.compute-2 for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.083Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.083Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.084Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.084Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=arp
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=bcache
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=bonding
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=cpu
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=dmi
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=edac
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=entropy
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=filefd
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=netclass
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=netdev
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=netstat
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=nfs
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=nvme
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=os
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=pressure
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=rapl
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=selinux
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=softnet
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=stat
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=textfile
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=time
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.085Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.086Z caller=node_exporter.go:117 level=info collector=uname
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.086Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.086Z caller=node_exporter.go:117 level=info collector=xfs
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.086Z caller=node_exporter.go:117 level=info collector=zfs
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.086Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 22 04:34:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2[84480]: ts=2026-01-22T09:34:59.086Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 22 04:34:59 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:59 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:59 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:59 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:34:59 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:35:01 np0005591762 ceph-mon[75519]: from='client.? 192.168.122.100:0/3300870308' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 22 04:35:01 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:02 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:02 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:02 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.kjnvpx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:35:02 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.kjnvpx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 22 04:35:02 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: Deploying daemon rgw.rgw.compute-1.kjnvpx on compute-1
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.kfoyhi", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.kfoyhi", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 22 04:35:03 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: Deploying daemon rgw.rgw.compute-0.kfoyhi on compute-0
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.zwrmjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 22 04:35:04 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.zwrmjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:05.007141799 +0000 UTC m=+0.029665927 container create f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_mcnulty, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:35:05 np0005591762 systemd[1]: Started libpod-conmon-f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2.scope.
Jan 22 04:35:05 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:05.064832599 +0000 UTC m=+0.087356726 container init f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_mcnulty, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:05.069379409 +0000 UTC m=+0.091903536 container start f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_mcnulty, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:05.070824492 +0000 UTC m=+0.093348620 container attach f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_mcnulty, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:35:05 np0005591762 hopeful_mcnulty[84587]: 167 167
Jan 22 04:35:05 np0005591762 systemd[1]: libpod-f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2.scope: Deactivated successfully.
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:05.085046269 +0000 UTC m=+0.107570396 container died f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_mcnulty, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:04.995098249 +0000 UTC m=+0.017622396 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:35:05 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c33b8317daca4ca41de9197d659596d809056842faf66d34f7bf45512ee67e7c-merged.mount: Deactivated successfully.
Jan 22 04:35:05 np0005591762 podman[84573]: 2026-01-22 09:35:05.102821032 +0000 UTC m=+0.125345158 container remove f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_mcnulty, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 22 04:35:05 np0005591762 systemd[1]: libpod-conmon-f8af2ae0898022d426e1a24e4d0493cbb878809a4dfc5f25a744651e7337e5b2.scope: Deactivated successfully.
Jan 22 04:35:05 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:05 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:35:05 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:05 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:05 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:05 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:35:05 np0005591762 systemd[1]: Starting Ceph mds.cephfs.compute-2.zwrmjl for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:35:05 np0005591762 ceph-mon[75519]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 22 04:35:05 np0005591762 ceph-mon[75519]: Deploying daemon mds.cephfs.compute-2.zwrmjl on compute-2
Jan 22 04:35:05 np0005591762 podman[84718]: 2026-01-22 09:35:05.697080128 +0000 UTC m=+0.027104229 container create 72398059a7170d4a3b7a9b45b7ffde8bad02042e72a8abd727d2d1dbf5f10cf0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mds-cephfs-compute-2-zwrmjl, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:35:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d30cf9fb6446d7a0d99bb5d18c956afb28d47a3a8cbbb09409e5864e3609b59/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:35:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d30cf9fb6446d7a0d99bb5d18c956afb28d47a3a8cbbb09409e5864e3609b59/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 04:35:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d30cf9fb6446d7a0d99bb5d18c956afb28d47a3a8cbbb09409e5864e3609b59/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 04:35:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d30cf9fb6446d7a0d99bb5d18c956afb28d47a3a8cbbb09409e5864e3609b59/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.zwrmjl supports timestamps until 2038 (0x7fffffff)
Jan 22 04:35:05 np0005591762 podman[84718]: 2026-01-22 09:35:05.740520828 +0000 UTC m=+0.070544929 container init 72398059a7170d4a3b7a9b45b7ffde8bad02042e72a8abd727d2d1dbf5f10cf0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mds-cephfs-compute-2-zwrmjl, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:35:05 np0005591762 podman[84718]: 2026-01-22 09:35:05.744480411 +0000 UTC m=+0.074504512 container start 72398059a7170d4a3b7a9b45b7ffde8bad02042e72a8abd727d2d1dbf5f10cf0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mds-cephfs-compute-2-zwrmjl, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:35:05 np0005591762 bash[84718]: 72398059a7170d4a3b7a9b45b7ffde8bad02042e72a8abd727d2d1dbf5f10cf0
Jan 22 04:35:05 np0005591762 podman[84718]: 2026-01-22 09:35:05.685742488 +0000 UTC m=+0.015766609 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:35:05 np0005591762 systemd[1]: Started Ceph mds.cephfs.compute-2.zwrmjl for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:35:05 np0005591762 ceph-mds[84734]: set uid:gid to 167:167 (ceph:ceph)
Jan 22 04:35:05 np0005591762 ceph-mds[84734]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 22 04:35:05 np0005591762 ceph-mds[84734]: main not setting numa affinity
Jan 22 04:35:05 np0005591762 ceph-mds[84734]: pidfile_write: ignore empty --pid-file
Jan 22 04:35:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mds-cephfs-compute-2-zwrmjl[84730]: starting mds.cephfs.compute-2.zwrmjl at 
Jan 22 04:35:05 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl Updating MDS map to version 2 from mon.1
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e3 new map
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e3 print_map#012e3#012btime 2026-01-22T09:35:06:688450+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:34:52.434283+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.zwrmjl{-1:24346} state up:standby seq 1 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl Updating MDS map to version 3 from mon.1
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl Monitors have assigned me to become a standby
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e4 new map
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e4 print_map#012e4#012btime 2026-01-22T09:35:06:694486+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:35:06.694480+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24346}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.zwrmjl{0:24346} state up:creating seq 1 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl Updating MDS map to version 4 from mon.1
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.4 handle_mds_map I am now mds.0.4
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x1
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x100
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x600
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x601
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x602
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x603
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x604
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x605
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x606
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x607
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x608
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.cache creating system inode with ino:0x609
Jan 22 04:35:06 np0005591762 ceph-mds[84734]: mds.0.4 creating_done
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xazhzz", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xazhzz", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: Deploying daemon mds.cephfs.compute-0.xazhzz on compute-0
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: daemon mds.cephfs.compute-2.zwrmjl assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: Cluster is now healthy
Jan 22 04:35:06 np0005591762 ceph-mon[75519]: daemon mds.cephfs.compute-2.zwrmjl is now active in filesystem cephfs as rank 0
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e5 new map
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e5 print_map#012e5#012btime 2026-01-22T09:35:07:697007+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:35:07.697005+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24346}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24346 members: 24346#012[mds.cephfs.compute-2.zwrmjl{0:24346} state up:active seq 2 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xazhzz{-1:14616} state up:standby seq 1 addr [v2:192.168.122.100:6806/4190227067,v1:192.168.122.100:6807/4190227067] compat {c=[1],r=[1],i=[1fff]}]
Jan 22 04:35:07 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl Updating MDS map to version 5 from mon.1
Jan 22 04:35:07 np0005591762 ceph-mds[84734]: mds.0.4 handle_mds_map I am now mds.0.4
Jan 22 04:35:07 np0005591762 ceph-mds[84734]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 22 04:35:07 np0005591762 ceph-mds[84734]: mds.0.4 recovery_done -- successful recovery!
Jan 22 04:35:07 np0005591762 ceph-mds[84734]: mds.0.4 active_start
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e6 new map
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e6 print_map#012e6#012btime 2026-01-22T09:35:07:704688+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:35:07.697005+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24346}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24346 members: 24346#012[mds.cephfs.compute-2.zwrmjl{0:24346} state up:active seq 2 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xazhzz{-1:14616} state up:standby seq 1 addr [v2:192.168.122.100:6806/4190227067,v1:192.168.122.100:6807/4190227067] compat {c=[1],r=[1],i=[1fff]}]
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.sqikyq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.sqikyq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 22 04:35:07 np0005591762 ceph-mon[75519]: Deploying daemon mds.cephfs.compute-1.sqikyq on compute-1
Jan 22 04:35:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e7 new map
Jan 22 04:35:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e7 print_map#012e7#012btime 2026-01-22T09:35:08:985015+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:35:07.697005+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24346}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24346 members: 24346#012[mds.cephfs.compute-2.zwrmjl{0:24346} state up:active seq 2 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xazhzz{-1:14616} state up:standby seq 1 addr [v2:192.168.122.100:6806/4190227067,v1:192.168.122.100:6807/4190227067] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.sqikyq{-1:24347} state up:standby seq 1 addr [v2:192.168.122.101:6804/2295742283,v1:192.168.122.101:6805/2295742283] compat {c=[1],r=[1],i=[1fff]}]
Jan 22 04:35:09 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:09 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:09 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:09 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:09 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:09 np0005591762 ceph-mon[75519]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e8 new map
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e8 print_map#012e8#012btime 2026-01-22T09:35:11:082871+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:35:10.708017+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24346}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24346 members: 24346#012[mds.cephfs.compute-2.zwrmjl{0:24346} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xazhzz{-1:14616} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4190227067,v1:192.168.122.100:6807/4190227067] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.sqikyq{-1:24347} state up:standby seq 1 addr [v2:192.168.122.101:6804/2295742283,v1:192.168.122.101:6805/2295742283] compat {c=[1],r=[1],i=[1fff]}]
Jan 22 04:35:11 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl Updating MDS map to version 8 from mon.1
Jan 22 04:35:11 np0005591762 ceph-mds[84734]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Jan 22 04:35:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mds-cephfs-compute-2-zwrmjl[84730]: 2026-01-22T09:35:11.702+0000 7fd14bc3e640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Jan 22 04:35:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: Regenerating cephadm self-signed grafana TLS certificates
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: Deploying daemon grafana.compute-0 on compute-0
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e9 new map
Jan 22 04:35:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).mds e9 print_map#012e9#012btime 2026-01-22T09:35:12:468222+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-22T09:34:52.434283+0000#012modified#0112026-01-22T09:35:10.708017+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24346}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24346 members: 24346#012[mds.cephfs.compute-2.zwrmjl{0:24346} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/3709284713,v1:192.168.122.102:6805/3709284713] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xazhzz{-1:14616} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4190227067,v1:192.168.122.100:6807/4190227067] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.sqikyq{-1:24347} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2295742283,v1:192.168.122.101:6805/2295742283] compat {c=[1],r=[1],i=[1fff]}]
Jan 22 04:35:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:18 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:18 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:18 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:18 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:18 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:18 np0005591762 ceph-mon[75519]: Deploying daemon haproxy.rgw.default.compute-0.duivti on compute-0
Jan 22 04:35:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:22 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:24 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:24 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:24 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:24 np0005591762 ceph-mon[75519]: Deploying daemon haproxy.rgw.default.compute-2.czpvbf on compute-2
Jan 22 04:35:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000019s ======
Jan 22 04:35:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:24.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000019s
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.725096612 +0000 UTC m=+2.145176352 container create 78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1 (image=quay.io/ceph/haproxy:2.3, name=wizardly_bell)
Jan 22 04:35:25 np0005591762 systemd[1]: Started libpod-conmon-78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1.scope.
Jan 22 04:35:25 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.775121843 +0000 UTC m=+2.195201583 container init 78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1 (image=quay.io/ceph/haproxy:2.3, name=wizardly_bell)
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.780623724 +0000 UTC m=+2.200703464 container start 78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1 (image=quay.io/ceph/haproxy:2.3, name=wizardly_bell)
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.781854743 +0000 UTC m=+2.201934504 container attach 78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1 (image=quay.io/ceph/haproxy:2.3, name=wizardly_bell)
Jan 22 04:35:25 np0005591762 wizardly_bell[84950]: 0 0
Jan 22 04:35:25 np0005591762 systemd[1]: libpod-78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1.scope: Deactivated successfully.
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.784664639 +0000 UTC m=+2.204744380 container died 78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1 (image=quay.io/ceph/haproxy:2.3, name=wizardly_bell)
Jan 22 04:35:25 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c02d1ee2dc4ea7f6711aac284b402c62ac5b9485e157bee0290cb29f67b8a9d8-merged.mount: Deactivated successfully.
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.715013557 +0000 UTC m=+2.135093318 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 22 04:35:25 np0005591762 podman[84851]: 2026-01-22 09:35:25.802098521 +0000 UTC m=+2.222178261 container remove 78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1 (image=quay.io/ceph/haproxy:2.3, name=wizardly_bell)
Jan 22 04:35:25 np0005591762 systemd[1]: libpod-conmon-78aabb2aee38de144ac77c44e110234738c8503f7ae27b72e74f0a38db753ca1.scope: Deactivated successfully.
Jan 22 04:35:25 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:25 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:35:25 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:26 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:26 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:26 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:35:26 np0005591762 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.czpvbf for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:35:26 np0005591762 podman[85083]: 2026-01-22 09:35:26.408286611 +0000 UTC m=+0.027556351 container create e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:35:26 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c9d2ce370640bb7b2b64782f0d23c36f38a134a11038a147b47bc80738ab4f4/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 22 04:35:26 np0005591762 podman[85083]: 2026-01-22 09:35:26.444974272 +0000 UTC m=+0.064244033 container init e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:35:26 np0005591762 podman[85083]: 2026-01-22 09:35:26.448550173 +0000 UTC m=+0.067819913 container start e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:35:26 np0005591762 bash[85083]: e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e
Jan 22 04:35:26 np0005591762 podman[85083]: 2026-01-22 09:35:26.396847079 +0000 UTC m=+0.016116840 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 22 04:35:26 np0005591762 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.czpvbf for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:35:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf[85095]: [NOTICE] 021/093526 (2) : New worker #1 (4) forked
Jan 22 04:35:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:26.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 22 04:35:27 np0005591762 ceph-mon[75519]: Deploying daemon keepalived.rgw.default.compute-0.idkctu on compute-0
Jan 22 04:35:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:35:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:27.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:35:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:28.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:35:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:29.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:35:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:30.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:35:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:31.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:35:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:35:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:32.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:35:33 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:33 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:33 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:33 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 22 04:35:33 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 22 04:35:33 np0005591762 ceph-mon[75519]: Deploying daemon keepalived.rgw.default.compute-2.udkjbg on compute-2
Jan 22 04:35:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:33.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:34.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.773598762 +0000 UTC m=+2.832916296 container create f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd (image=quay.io/ceph/keepalived:2.2.4, name=admiring_swirles, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, description=keepalived for Ceph, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vcs-type=git, io.buildah.version=1.28.2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, release=1793, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 22 04:35:35 np0005591762 systemd[1]: Started libpod-conmon-f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd.scope.
Jan 22 04:35:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:35:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:35.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:35:35 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.824470094 +0000 UTC m=+2.883787638 container init f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd (image=quay.io/ceph/keepalived:2.2.4, name=admiring_swirles, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vcs-type=git, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, release=1793, description=keepalived for Ceph, version=2.2.4)
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.829753524 +0000 UTC m=+2.889071059 container start f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd (image=quay.io/ceph/keepalived:2.2.4, name=admiring_swirles, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., name=keepalived, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.830959585 +0000 UTC m=+2.890277118 container attach f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd (image=quay.io/ceph/keepalived:2.2.4, name=admiring_swirles, name=keepalived, distribution-scope=public, io.buildah.version=1.28.2, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, vcs-type=git, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.expose-services=, build-date=2023-02-22T09:23:20)
Jan 22 04:35:35 np0005591762 admiring_swirles[85270]: 0 0
Jan 22 04:35:35 np0005591762 systemd[1]: libpod-f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd.scope: Deactivated successfully.
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.833559095 +0000 UTC m=+2.892876629 container died f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd (image=quay.io/ceph/keepalived:2.2.4, name=admiring_swirles, com.redhat.component=keepalived-container, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, io.buildah.version=1.28.2, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Jan 22 04:35:35 np0005591762 systemd[1]: var-lib-containers-storage-overlay-882bff2d9b8c56cf739785f3a43e691fad5d940548e0f7db9e93c12c14aa2e34-merged.mount: Deactivated successfully.
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.848998155 +0000 UTC m=+2.908315690 container remove f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd (image=quay.io/ceph/keepalived:2.2.4, name=admiring_swirles, vcs-type=git, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 04:35:35 np0005591762 podman[85188]: 2026-01-22 09:35:35.764400475 +0000 UTC m=+2.823718029 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 22 04:35:35 np0005591762 systemd[1]: libpod-conmon-f62aefa7901b6abe1da8268289d52ef9e06e6cb16a24e078fb050fbc126061bd.scope: Deactivated successfully.
Jan 22 04:35:35 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:35 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:35:35 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:36 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:36 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:35:36 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:36 np0005591762 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.udkjbg for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:35:36 np0005591762 podman[85404]: 2026-01-22 09:35:36.44517102 +0000 UTC m=+0.027115943 container create 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, name=keepalived)
Jan 22 04:35:36 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f470811316f9d42b464f759de65a4d9c0b29f7d60a781bf6804b0190e83ae77/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:35:36 np0005591762 podman[85404]: 2026-01-22 09:35:36.475344716 +0000 UTC m=+0.057289659 container init 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, description=keepalived for Ceph, com.redhat.component=keepalived-container, distribution-scope=public, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 04:35:36 np0005591762 podman[85404]: 2026-01-22 09:35:36.479342879 +0000 UTC m=+0.061287811 container start 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.tags=Ceph keepalived, distribution-scope=public, architecture=x86_64, name=keepalived, vcs-type=git, build-date=2023-02-22T09:23:20)
Jan 22 04:35:36 np0005591762 bash[85404]: 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0
Jan 22 04:35:36 np0005591762 podman[85404]: 2026-01-22 09:35:36.43407436 +0000 UTC m=+0.016019293 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 22 04:35:36 np0005591762 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.udkjbg for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: Starting VRRP child process, pid=4
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: Startup complete
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: (VI_0) Entering BACKUP STATE (init)
Jan 22 04:35:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:36 2026: VRRP_Script(check_backend) succeeded
Jan 22 04:35:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:36.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:36 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:36 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:36 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:36 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:37 np0005591762 ceph-mon[75519]: Deploying daemon prometheus.compute-0 on compute-0
Jan 22 04:35:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:37.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:38.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:35:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:39.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:35:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:40 2026: (VI_0) Entering MASTER STATE
Jan 22 04:35:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:40 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 22 04:35:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:35:40 2026: (VI_0) Entering BACKUP STATE
Jan 22 04:35:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:40.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:41.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:42 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:42.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.104536) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074543104612, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 5605, "num_deletes": 258, "total_data_size": 16369745, "memory_usage": 17258416, "flush_reason": "Manual Compaction"}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074543122610, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10327702, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 5610, "table_properties": {"data_size": 10307325, "index_size": 12775, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6533, "raw_key_size": 62746, "raw_average_key_size": 24, "raw_value_size": 10256535, "raw_average_value_size": 3935, "num_data_blocks": 569, "num_entries": 2606, "num_filter_entries": 2606, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 1769074431, "file_creation_time": 1769074543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 18110 microseconds, and 12625 cpu microseconds.
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.122649) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10327702 bytes OK
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.122662) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.126317) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.126328) EVENT_LOG_v1 {"time_micros": 1769074543126325, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.126342) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16340128, prev total WAL file size 16340128, number of live WAL files 2.
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.128174) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323534' seq:0, type:0; will stop at (end)
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10085KB) 8(1648B)]
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074543128251, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10329350, "oldest_snapshot_seqno": -1}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 2351 keys, 10323892 bytes, temperature: kUnknown
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074543144909, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10323892, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10304162, "index_size": 12773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 5893, "raw_key_size": 59255, "raw_average_key_size": 25, "raw_value_size": 10256593, "raw_average_value_size": 4362, "num_data_blocks": 569, "num_entries": 2351, "num_filter_entries": 2351, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769074543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.145017) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10323892 bytes
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.145323) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 619.0 rd, 618.6 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(9.9, 0.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 2611, records dropped: 260 output_compression: NoCompression
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.145342) EVENT_LOG_v1 {"time_micros": 1769074543145331, "job": 4, "event": "compaction_finished", "compaction_time_micros": 16688, "compaction_time_cpu_micros": 13176, "output_level": 6, "num_output_files": 1, "total_output_size": 10323892, "num_input_records": 2611, "num_output_records": 2351, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074543146390, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074543146583, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:35:43.128106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:43 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  1: '-n'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  2: 'mgr.compute-2.bisona'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  3: '-f'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  4: '--setuser'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  5: 'ceph'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  6: '--setgroup'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  7: 'ceph'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  8: '--default-log-to-file=false'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  9: '--default-log-to-journald=true'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr respawn  exe_path /proc/self/exe
Jan 22 04:35:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setuser ceph since I am not root
Jan 22 04:35:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: ignoring --setgroup ceph since I am not root
Jan 22 04:35:43 np0005591762 systemd[1]: session-33.scope: Deactivated successfully.
Jan 22 04:35:43 np0005591762 systemd[1]: session-33.scope: Consumed 12.767s CPU time.
Jan 22 04:35:43 np0005591762 systemd-logind[744]: Session 33 logged out. Waiting for processes to exit.
Jan 22 04:35:43 np0005591762 systemd-logind[744]: Removed session 33.
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: pidfile_write: ignore empty --pid-file
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'alerts'
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'balancer'
Jan 22 04:35:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:43.717+0000 7f4526fa6140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:35:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:43.788+0000 7f4526fa6140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 22 04:35:43 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'cephadm'
Jan 22 04:35:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:43.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'crash'
Jan 22 04:35:44 np0005591762 ceph-mgr[75802]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:35:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'dashboard'
Jan 22 04:35:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:44.461+0000 7f4526fa6140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 22 04:35:44 np0005591762 ceph-mon[75519]: from='mgr.14496 192.168.122.100:0/2579283684' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 22 04:35:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:44.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:44 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'devicehealth'
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:45.009+0000 7f4526fa6140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'diskprediction_local'
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]:  from numpy import show_config as show_numpy_config
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:45.150+0000 7f4526fa6140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'influx'
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:45.214+0000 7f4526fa6140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'insights'
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'iostat'
Jan 22 04:35:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:45.333+0000 7f4526fa6140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'k8sevents'
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'localpool'
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mds_autoscaler'
Jan 22 04:35:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:35:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:45.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'mirroring'
Jan 22 04:35:45 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'nfs'
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.188+0000 7f4526fa6140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'orchestrator'
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.377+0000 7f4526fa6140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_perf_query'
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.444+0000 7f4526fa6140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'osd_support'
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.502+0000 7f4526fa6140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'pg_autoscaler'
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.570+0000 7f4526fa6140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'progress'
Jan 22 04:35:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:35:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:46.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.633+0000 7f4526fa6140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'prometheus'
Jan 22 04:35:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:46.931+0000 7f4526fa6140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 22 04:35:46 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rbd_support'
Jan 22 04:35:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:47.016+0000 7f4526fa6140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'restful'
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rgw'
Jan 22 04:35:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:47.392+0000 7f4526fa6140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'rook'
Jan 22 04:35:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:47.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:47.874+0000 7f4526fa6140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'selftest'
Jan 22 04:35:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:47.936+0000 7f4526fa6140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 22 04:35:47 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'snap_schedule'
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.006+0000 7f4526fa6140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'stats'
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'status'
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.133+0000 7f4526fa6140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telegraf'
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.195+0000 7f4526fa6140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'telemetry'
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.328+0000 7f4526fa6140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'test_orchestrator'
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.519+0000 7f4526fa6140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'volumes'
Jan 22 04:35:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:48.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.749+0000 7f4526fa6140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Loading python module 'zabbix'
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 2026-01-22T09:35:48.810+0000 7f4526fa6140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr load Constructed class from module: dashboard
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: mgr load Constructed class from module: prometheus
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: ms_deliver_dispatch: unhandled message 0x55c1b2d27860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Starting engine...
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus INFO root] Starting engine...
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: [22/Jan/2026:09:35:48] ENGINE Bus STARTING
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus INFO cherrypy.error] [22/Jan/2026:09:35:48] ENGINE Bus STARTING
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: CherryPy Checker:
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: The Application mounted at '' has an empty config.
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: 
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [dashboard INFO root] Engine started...
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: [22/Jan/2026:09:35:48] ENGINE Serving on http://:::9283
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus INFO cherrypy.error] [22/Jan/2026:09:35:48] ENGINE Serving on http://:::9283
Jan 22 04:35:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mgr-compute-2-bisona[75798]: [22/Jan/2026:09:35:48] ENGINE Bus STARTED
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus INFO cherrypy.error] [22/Jan/2026:09:35:48] ENGINE Bus STARTED
Jan 22 04:35:48 np0005591762 ceph-mgr[75802]: [prometheus INFO root] Engine started.
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 22 04:35:49 np0005591762 systemd-logind[744]: New session 35 of user ceph-admin.
Jan 22 04:35:49 np0005591762 systemd[1]: Started Session 35 of User ceph-admin.
Jan 22 04:35:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:49.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: Active manager daemon compute-0.rfmoog restarted
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: Activating manager daemon compute-0.rfmoog
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: Manager daemon compute-0.rfmoog is now available
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"}]: dispatch
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/mirror_snapshot_schedule"}]: dispatch
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"}]: dispatch
Jan 22 04:35:49 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.rfmoog/trash_purge_schedule"}]: dispatch
Jan 22 04:35:50 np0005591762 podman[85594]: 2026-01-22 09:35:50.293203278 +0000 UTC m=+0.038895558 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:35:50 np0005591762 podman[85594]: 2026-01-22 09:35:50.369927554 +0000 UTC m=+0.115619854 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:35:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:50.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:50 np0005591762 podman[85703]: 2026-01-22 09:35:50.713578758 +0000 UTC m=+0.035211095 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:35:50 np0005591762 podman[85703]: 2026-01-22 09:35:50.720590942 +0000 UTC m=+0.042223279 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:35:50 np0005591762 podman[85774]: 2026-01-22 09:35:50.909435747 +0000 UTC m=+0.035798801 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:35:50 np0005591762 podman[85774]: 2026-01-22 09:35:50.912050437 +0000 UTC m=+0.038413491 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:35:51 np0005591762 podman[85828]: 2026-01-22 09:35:51.049471472 +0000 UTC m=+0.034285313 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-type=git, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., architecture=x86_64)
Jan 22 04:35:51 np0005591762 podman[85828]: 2026-01-22 09:35:51.056619963 +0000 UTC m=+0.041433804 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.buildah.version=1.28.2, release=1793, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, name=keepalived, build-date=2023-02-22T09:23:20, distribution-scope=public, vendor=Red Hat, Inc., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64)
Jan 22 04:35:51 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:35:50] ENGINE Bus STARTING
Jan 22 04:35:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:51.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:35:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:52.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:35:50] ENGINE Serving on http://192.168.122.100:8765
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:35:50] ENGINE Serving on https://192.168.122.100:7150
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:35:50] ENGINE Bus STARTED
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: [22/Jan/2026:09:35:50] ENGINE Client ('192.168.122.100', 42564) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:53.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.conf
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.conf
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.conf
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:35:53 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.conf
Jan 22 04:35:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:54.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: Updating compute-2:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: Updating compute-1:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: Updating compute-0:/var/lib/ceph/43df7a30-cf5f-5209-adfd-bf44298b19f2/config/ceph.client.admin.keyring
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.pszzrs", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.pszzrs", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.pszzrs", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.pszzrs-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:35:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.pszzrs-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:35:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:55.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Failed to apply ingress.nfs.cephfs spec IngressSpec.from_json(yaml.safe_load('''service_type: ingress
service_id: nfs.cephfs
service_name: ingress.nfs.cephfs
placement:
  hosts:
  - compute-0
  - compute-1
  - compute-2
spec:
  backend_service: nfs.cephfs
  enable_haproxy_protocol: true
  first_virtual_router_id: 50
  frontend_port: 2049
  monitor_port: 9049
  virtual_ip: 192.168.122.2/24
''')): max() arg is an empty sequence
Traceback (most recent call last):
  File "/usr/share/ceph/mgr/cephadm/serve.py", line 602, in _apply_all_services
    if self._apply_service(spec):
  File "/usr/share/ceph/mgr/cephadm/serve.py", line 947, in _apply_service
    daemon_spec = svc.prepare_create(daemon_spec)
  File "/usr/share/ceph/mgr/cephadm/services/ingress.py", line 46, in prepare_create
    return self.haproxy_prepare_create(daemon_spec)
  File "/usr/share/ceph/mgr/cephadm/services/ingress.py", line 74, in haproxy_prepare_create
    daemon_spec.final_config, daemon_spec.deps = self.haproxy_generate_config(daemon_spec)
  File "/usr/share/ceph/mgr/cephadm/services/ingress.py", line 139, in haproxy_generate_config
    num_ranks = 1 + max(by_rank.keys())
ValueError: max() arg is an empty sequence
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Creating key for client.nfs.cephfs.0.0.compute-1.pszzrs
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Rados config object exists: conf-nfs.cephfs
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Creating key for client.nfs.cephfs.0.0.compute-1.pszzrs-rgw
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.pszzrs-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Bind address in nfs.cephfs.0.0.compute-1.pszzrs's ganesha conf is defaulting to empty
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Deploying daemon nfs.cephfs.0.0.compute-1.pszzrs on compute-1
Jan 22 04:35:55 np0005591762 ceph-mon[75519]: Health check failed: Failed to apply 1 service(s): ingress.nfs.cephfs (CEPHADM_APPLY_SPEC_FAIL)
Jan 22 04:35:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:56.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: Creating key for client.nfs.cephfs.1.0.compute-2.qniaxp
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.qniaxp", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.qniaxp", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.qniaxp", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 22 04:35:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 22 04:35:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:57.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:35:58.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.690711656 +0000 UTC m=+0.027163713 container create 474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_antonelli, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:35:59 np0005591762 systemd[1]: Started libpod-conmon-474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6.scope.
Jan 22 04:35:59 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.741375186 +0000 UTC m=+0.077827243 container init 474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_antonelli, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.747203182 +0000 UTC m=+0.083655239 container start 474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_antonelli, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.748505262 +0000 UTC m=+0.084957319 container attach 474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_antonelli, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:35:59 np0005591762 clever_antonelli[86992]: 167 167
Jan 22 04:35:59 np0005591762 systemd[1]: libpod-474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6.scope: Deactivated successfully.
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.751266487 +0000 UTC m=+0.087718545 container died 474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_antonelli, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:35:59 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c402c1ed7ec0c4b7d127b1e394471d4fd5167874e29048bcf040027661ddb633-merged.mount: Deactivated successfully.
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.678770156 +0000 UTC m=+0.015222223 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:35:59 np0005591762 podman[86979]: 2026-01-22 09:35:59.781998524 +0000 UTC m=+0.118450580 container remove 474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_antonelli, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:35:59 np0005591762 systemd[1]: libpod-conmon-474aa824f2cded4f42e5dfabecb35f3348827d334bb37843050e91fa8d0b07f6.scope: Deactivated successfully.
Jan 22 04:35:59 np0005591762 systemd[1]: Reloading.
Jan 22 04:35:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:35:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:35:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:35:59.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:35:59 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:35:59 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:36:00 np0005591762 systemd[1]: Reloading.
Jan 22 04:36:00 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:36:00 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:36:00 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: Rados config object exists: conf-nfs.cephfs
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: Creating key for client.nfs.cephfs.1.0.compute-2.qniaxp-rgw
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.qniaxp-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.qniaxp-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.qniaxp-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: Bind address in nfs.cephfs.1.0.compute-2.qniaxp's ganesha conf is defaulting to empty
Jan 22 04:36:00 np0005591762 ceph-mon[75519]: Deploying daemon nfs.cephfs.1.0.compute-2.qniaxp on compute-2
Jan 22 04:36:00 np0005591762 podman[87122]: 2026-01-22 09:36:00.402638056 +0000 UTC m=+0.026420725 container create 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 22 04:36:00 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/775b89649c08bc74313081ba8794beae129ab390be1891918b3bb31aca319fa0/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:36:00 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/775b89649c08bc74313081ba8794beae129ab390be1891918b3bb31aca319fa0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:36:00 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/775b89649c08bc74313081ba8794beae129ab390be1891918b3bb31aca319fa0/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:36:00 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/775b89649c08bc74313081ba8794beae129ab390be1891918b3bb31aca319fa0/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:36:00 np0005591762 podman[87122]: 2026-01-22 09:36:00.449539927 +0000 UTC m=+0.073322616 container init 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 22 04:36:00 np0005591762 podman[87122]: 2026-01-22 09:36:00.453311253 +0000 UTC m=+0.077093923 container start 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 22 04:36:00 np0005591762 bash[87122]: 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca
Jan 22 04:36:00 np0005591762 podman[87122]: 2026-01-22 09:36:00.39191578 +0000 UTC m=+0.015698469 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:36:00 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:36:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
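[Annotation] The `ret=-2` / `-2` failures in the ganesha grace-db lines above are negative errno values, a convention librados and ganesha both use. Decoding them shows this is `ENOENT`: the recovery objects simply do not exist yet on a freshly deployed NFS cluster, so these events are expected on first start. A quick decoding sketch:

```python
import errno
import os

# Ganesha/librados report failures as negative errno values.
# ret=-2 in the rados_kv_traverse / rados_cluster_read_clids
# lines decodes to ENOENT: the grace/recovery DB objects are
# missing because this daemon was just deployed.
ret = -2
print(errno.errorcode[-ret])  # ENOENT
print(os.strerror(-ret))      # No such file or directory
```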
Jan 22 04:36:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:00.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: Creating key for client.nfs.cephfs.2.0.compute-0.ylzmiu
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ylzmiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ylzmiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ylzmiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 22 04:36:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:01.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:02.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:03.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ylzmiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ylzmiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ylzmiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 22 04:36:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:04.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: Rados config object exists: conf-nfs.cephfs
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: Creating key for client.nfs.cephfs.2.0.compute-0.ylzmiu-rgw
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: Bind address in nfs.cephfs.2.0.compute-0.ylzmiu's ganesha conf is defaulting to empty
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: Deploying daemon nfs.cephfs.2.0.compute-0.ylzmiu on compute-0
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:05 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:36:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:05.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:36:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:36:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:06.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:07.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:08.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:09 np0005591762 podman[87350]: 2026-01-22 09:36:09.542597149 +0000 UTC m=+0.040150078 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 22 04:36:09 np0005591762 podman[87350]: 2026-01-22 09:36:09.620206251 +0000 UTC m=+0.117759180 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 22 04:36:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:09.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:09 np0005591762 podman[87460]: 2026-01-22 09:36:09.976547471 +0000 UTC m=+0.034148475 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:36:09 np0005591762 podman[87460]: 2026-01-22 09:36:09.984645409 +0000 UTC m=+0.042246403 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:36:10 np0005591762 podman[87531]: 2026-01-22 09:36:10.172604828 +0000 UTC m=+0.034071892 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:36:10 np0005591762 podman[87531]: 2026-01-22 09:36:10.181649076 +0000 UTC m=+0.043116119 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:36:10 np0005591762 podman[87584]: 2026-01-22 09:36:10.31973895 +0000 UTC m=+0.035968621 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, version=2.2.4, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, release=1793, description=keepalived for Ceph, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Jan 22 04:36:10 np0005591762 podman[87584]: 2026-01-22 09:36:10.329593553 +0000 UTC m=+0.045823224 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, io.buildah.version=1.28.2, vcs-type=git, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, name=keepalived, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 22 04:36:10 np0005591762 podman[87625]: 2026-01-22 09:36:10.43391157 +0000 UTC m=+0.034689765 container exec 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 22 04:36:10 np0005591762 podman[87625]: 2026-01-22 09:36:10.443594029 +0000 UTC m=+0.044372223 container exec_died 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 04:36:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:10.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:11.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:12 np0005591762 systemd-logind[744]: New session 36 of user zuul.
Jan 22 04:36:12 np0005591762 systemd[1]: Started Session 36 of User zuul.
Jan 22 04:36:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:12 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:36:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:12.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:12 np0005591762 python3.9[87807]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:36:13 np0005591762 ceph-mon[75519]: Deploying daemon haproxy.nfs.cephfs.compute-1.zxzfsl on compute-1
Jan 22 04:36:13 np0005591762 ceph-mon[75519]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 1 service(s): ingress.nfs.cephfs)
Jan 22 04:36:13 np0005591762 ceph-mon[75519]: Cluster is now healthy
Jan 22 04:36:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:13.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:14 np0005591762 python3.9[88021]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:36:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:14.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:15.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:16 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:16 np0005591762 ceph-mon[75519]: Deploying daemon haproxy.nfs.cephfs.compute-0.dnpemq on compute-0
Jan 22 04:36:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:16.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.44566021 +0000 UTC m=+0.028442338 container create 3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434 (image=quay.io/ceph/haproxy:2.3, name=hardcore_shaw)
Jan 22 04:36:17 np0005591762 systemd[1]: Started libpod-conmon-3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434.scope.
Jan 22 04:36:17 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.504485957 +0000 UTC m=+0.087268106 container init 3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434 (image=quay.io/ceph/haproxy:2.3, name=hardcore_shaw)
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.510335354 +0000 UTC m=+0.093117483 container start 3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434 (image=quay.io/ceph/haproxy:2.3, name=hardcore_shaw)
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.511463276 +0000 UTC m=+0.094245405 container attach 3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434 (image=quay.io/ceph/haproxy:2.3, name=hardcore_shaw)
Jan 22 04:36:17 np0005591762 hardcore_shaw[88139]: 0 0
Jan 22 04:36:17 np0005591762 systemd[1]: libpod-3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434.scope: Deactivated successfully.
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.51404812 +0000 UTC m=+0.096830249 container died 3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434 (image=quay.io/ceph/haproxy:2.3, name=hardcore_shaw)
Jan 22 04:36:17 np0005591762 systemd[1]: var-lib-containers-storage-overlay-fafda6bb9c484f4cbaa66a47456f3066bb6c35b76a9d90aac17f9ec8c3308a03-merged.mount: Deactivated successfully.
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.433562647 +0000 UTC m=+0.016344786 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 22 04:36:17 np0005591762 podman[88126]: 2026-01-22 09:36:17.533392919 +0000 UTC m=+0.116175048 container remove 3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434 (image=quay.io/ceph/haproxy:2.3, name=hardcore_shaw)
Jan 22 04:36:17 np0005591762 systemd[1]: libpod-conmon-3019e059524ac1db07e6cb1efe7c103800c1613f21f847d37f2d4757fb06b434.scope: Deactivated successfully.
Jan 22 04:36:17 np0005591762 systemd[1]: Reloading.
Jan 22 04:36:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:17 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:36:17 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:36:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:17 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc004460 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:17 np0005591762 systemd[1]: Reloading.
Jan 22 04:36:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:17.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:17 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:36:17 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:36:18 np0005591762 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.uczfqf for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:36:18 np0005591762 podman[88280]: 2026-01-22 09:36:18.188008131 +0000 UTC m=+0.036657037 container create 14c7e84943f8ce4ef32aa27e6dfe9a982ec28e980ffe1d470948af013c3462cb (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf)
Jan 22 04:36:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104001ad0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:18 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86a9c14d32000147d680172b5f448ba5908e36e958dc4d40100ad88c8d2b97cd/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 22 04:36:18 np0005591762 podman[88280]: 2026-01-22 09:36:18.22565098 +0000 UTC m=+0.074299906 container init 14c7e84943f8ce4ef32aa27e6dfe9a982ec28e980ffe1d470948af013c3462cb (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf)
Jan 22 04:36:18 np0005591762 podman[88280]: 2026-01-22 09:36:18.229128524 +0000 UTC m=+0.077777430 container start 14c7e84943f8ce4ef32aa27e6dfe9a982ec28e980ffe1d470948af013c3462cb (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf)
Jan 22 04:36:18 np0005591762 bash[88280]: 14c7e84943f8ce4ef32aa27e6dfe9a982ec28e980ffe1d470948af013c3462cb
Jan 22 04:36:18 np0005591762 podman[88280]: 2026-01-22 09:36:18.175070977 +0000 UTC m=+0.023719903 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 22 04:36:18 np0005591762 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.uczfqf for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:36:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [NOTICE] 021/093618 (2) : New worker #1 (4) forked
Jan 22 04:36:18 np0005591762 ceph-mon[75519]: Deploying daemon haproxy.nfs.cephfs.compute-2.uczfqf on compute-2
Jan 22 04:36:18 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:18 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:18 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:18 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:18.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104001ad0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:19 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 22 04:36:19 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 22 04:36:19 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 22 04:36:19 np0005591762 ceph-mon[75519]: Deploying daemon keepalived.nfs.cephfs.compute-1.bcudmx on compute-1
Jan 22 04:36:19 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:19 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104001ad0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:19.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc004460 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:20.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc004460 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:21 np0005591762 systemd[1]: session-36.scope: Deactivated successfully.
Jan 22 04:36:21 np0005591762 systemd[1]: session-36.scope: Consumed 6.364s CPU time.
Jan 22 04:36:21 np0005591762 systemd-logind[744]: Session 36 logged out. Waiting for processes to exit.
Jan 22 04:36:21 np0005591762 systemd-logind[744]: Removed session 36.
Jan 22 04:36:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:21 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104008f60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:21.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104008f60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:22.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8001f80 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 22 04:36:23 np0005591762 ceph-mon[75519]: Deploying daemon keepalived.nfs.cephfs.compute-0.qtywyd on compute-0
Jan 22 04:36:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:23 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc005560 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:23.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104009e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.298141296 +0000 UTC m=+0.025395786 container create 553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be (image=quay.io/ceph/keepalived:2.2.4, name=elegant_khayyam, build-date=2023-02-22T09:23:20, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, distribution-scope=public, version=2.2.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.openshift.expose-services=)
Jan 22 04:36:24 np0005591762 systemd[1]: Started libpod-conmon-553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be.scope.
Jan 22 04:36:24 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.353385074 +0000 UTC m=+0.080639585 container init 553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be (image=quay.io/ceph/keepalived:2.2.4, name=elegant_khayyam, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, version=2.2.4, io.buildah.version=1.28.2, architecture=x86_64, io.openshift.tags=Ceph keepalived, distribution-scope=public, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.358140101 +0000 UTC m=+0.085394592 container start 553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be (image=quay.io/ceph/keepalived:2.2.4, name=elegant_khayyam, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.buildah.version=1.28.2, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, vcs-type=git, io.openshift.tags=Ceph keepalived, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, distribution-scope=public)
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.359124183 +0000 UTC m=+0.086378674 container attach 553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be (image=quay.io/ceph/keepalived:2.2.4, name=elegant_khayyam, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, architecture=x86_64, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, name=keepalived)
Jan 22 04:36:24 np0005591762 elegant_khayyam[88435]: 0 0
Jan 22 04:36:24 np0005591762 systemd[1]: libpod-553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be.scope: Deactivated successfully.
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.361666937 +0000 UTC m=+0.088921449 container died 553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be (image=quay.io/ceph/keepalived:2.2.4, name=elegant_khayyam, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.buildah.version=1.28.2, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 22 04:36:24 np0005591762 systemd[1]: var-lib-containers-storage-overlay-06ab0c76732c3d9fc0c76b8f49405289e9234f3bc49218772f7649ebea87ea95-merged.mount: Deactivated successfully.
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.382057596 +0000 UTC m=+0.109312086 container remove 553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be (image=quay.io/ceph/keepalived:2.2.4, name=elegant_khayyam, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, io.openshift.expose-services=, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, version=2.2.4, vendor=Red Hat, Inc., description=keepalived for Ceph)
Jan 22 04:36:24 np0005591762 podman[88422]: 2026-01-22 09:36:24.288393084 +0000 UTC m=+0.015647595 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 22 04:36:24 np0005591762 systemd[1]: libpod-conmon-553c132fb37b750249ee77c1f8d86ad0c9af3f113b41d9924f60ac80aec5f0be.scope: Deactivated successfully.
Jan 22 04:36:24 np0005591762 systemd[1]: Reloading.
Jan 22 04:36:24 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:36:24 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:36:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:24.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:24 np0005591762 systemd[1]: Reloading.
Jan 22 04:36:24 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:36:24 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:36:24 np0005591762 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.bromuh for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:36:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104009e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 22 04:36:24 np0005591762 ceph-mon[75519]: Deploying daemon keepalived.nfs.cephfs.compute-2.bromuh on compute-2
Jan 22 04:36:25 np0005591762 podman[88567]: 2026-01-22 09:36:25.003386795 +0000 UTC m=+0.024513747 container create 04c4e36b36327a3a0f51d1a84f99296ac52577934666c03fef588bdc6924d6ee (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, release=1793, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived)
Jan 22 04:36:25 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e15852807bbdff92b4f082f0944aa57922e873307ff238384957133cfa757e8/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:36:25 np0005591762 podman[88567]: 2026-01-22 09:36:25.039629309 +0000 UTC m=+0.060756281 container init 04c4e36b36327a3a0f51d1a84f99296ac52577934666c03fef588bdc6924d6ee (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, vendor=Red Hat, Inc., release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 22 04:36:25 np0005591762 podman[88567]: 2026-01-22 09:36:25.043120408 +0000 UTC m=+0.064247360 container start 04c4e36b36327a3a0f51d1a84f99296ac52577934666c03fef588bdc6924d6ee (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64)
Jan 22 04:36:25 np0005591762 bash[88567]: 04c4e36b36327a3a0f51d1a84f99296ac52577934666c03fef588bdc6924d6ee
Jan 22 04:36:25 np0005591762 podman[88567]: 2026-01-22 09:36:24.993768216 +0000 UTC m=+0.014895188 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 22 04:36:25 np0005591762 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.bromuh for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Failed to bind to process monitoring socket - errno 98 - Address already in use
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Starting VRRP child process, pid=4
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: Startup complete
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: (VI_0) Entering BACKUP STATE (init)
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: VRRP_Script(check_backend) succeeded
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:25 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8002aa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000007s ======
Jan 22 04:36:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:25.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000007s
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:36:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc005560 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:26.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104009e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.130893) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074587130929, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1627, "num_deletes": 252, "total_data_size": 6412791, "memory_usage": 6645584, "flush_reason": "Manual Compaction"}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074587138454, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3999227, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 5615, "largest_seqno": 7237, "table_properties": {"data_size": 3992612, "index_size": 3429, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 17108, "raw_average_key_size": 20, "raw_value_size": 3977702, "raw_average_value_size": 4850, "num_data_blocks": 154, "num_entries": 820, "num_filter_entries": 820, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074543, "oldest_key_time": 1769074543, "file_creation_time": 1769074587, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 7580 microseconds, and 5933 cpu microseconds.
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.138476) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3999227 bytes OK
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.138486) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.139438) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.139455) EVENT_LOG_v1 {"time_micros": 1769074587139452, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.139463) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 6404618, prev total WAL file size 6404618, number of live WAL files 2.
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.140268) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3905KB)], [15(10081KB)]
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074587140297, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 14323119, "oldest_snapshot_seqno": -1}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 2630 keys, 12957675 bytes, temperature: kUnknown
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074587162787, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12957675, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12936087, "index_size": 13927, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6597, "raw_key_size": 66622, "raw_average_key_size": 25, "raw_value_size": 12883351, "raw_average_value_size": 4898, "num_data_blocks": 617, "num_entries": 2630, "num_filter_entries": 2630, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769074587, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.163041) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12957675 bytes
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.164953) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 632.4 rd, 572.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 9.8 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(6.8) write-amplify(3.2) OK, records in: 3171, records dropped: 541 output_compression: NoCompression
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.164967) EVENT_LOG_v1 {"time_micros": 1769074587164961, "job": 6, "event": "compaction_finished", "compaction_time_micros": 22649, "compaction_time_cpu_micros": 16516, "output_level": 6, "num_output_files": 1, "total_output_size": 12957675, "num_input_records": 3171, "num_output_records": 2630, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074587165856, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074587167249, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.140222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.167361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.167364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.167366) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.167367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:36:27 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:36:27.167368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:36:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:27 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104009e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:27.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104009e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:28.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:29 np0005591762 podman[88746]: 2026-01-22 09:36:29.488545657 +0000 UTC m=+0.036789696 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:36:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:29 np0005591762 podman[88746]: 2026-01-22 09:36:29.570172087 +0000 UTC m=+0.118416146 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:36:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:29 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:29.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:29 np0005591762 podman[88860]: 2026-01-22 09:36:29.900768296 +0000 UTC m=+0.033157832 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:36:29 np0005591762 podman[88860]: 2026-01-22 09:36:29.907645716 +0000 UTC m=+0.040035232 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:36:30 np0005591762 podman[88930]: 2026-01-22 09:36:30.086357776 +0000 UTC m=+0.032591956 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:36:30 np0005591762 podman[88930]: 2026-01-22 09:36:30.095653297 +0000 UTC m=+0.041887458 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:36:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:30 np0005591762 podman[88983]: 2026-01-22 09:36:30.228452314 +0000 UTC m=+0.033650148 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vendor=Red Hat, Inc., release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived)
Jan 22 04:36:30 np0005591762 podman[88983]: 2026-01-22 09:36:30.237601791 +0000 UTC m=+0.042799625 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1793, description=keepalived for Ceph, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20)
Jan 22 04:36:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:30 np0005591762 podman[89024]: 2026-01-22 09:36:30.337643131 +0000 UTC m=+0.031126339 container exec 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 22 04:36:30 np0005591762 podman[89042]: 2026-01-22 09:36:30.395539429 +0000 UTC m=+0.044562410 container exec_died 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:36:30 np0005591762 podman[89024]: 2026-01-22 09:36:30.398560935 +0000 UTC m=+0.092044153 container exec_died 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 22 04:36:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400b990 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:31 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:31.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:32.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400b990 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:32 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:32 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:32 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:36:32 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:32 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:32 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:36:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:33 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400b990 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:33.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:34.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:35 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:35.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:36 np0005591762 systemd-logind[744]: New session 37 of user zuul.
Jan 22 04:36:36 np0005591762 systemd[1]: Started Session 37 of User zuul.
Jan 22 04:36:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:36.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006b90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:36 np0005591762 ceph-mon[75519]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Jan 22 04:36:36 np0005591762 ceph-mon[75519]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Jan 22 04:36:36 np0005591762 python3.9[89265]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 22 04:36:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:37 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:36:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:37.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:36:37 np0005591762 python3.9[89440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:36:38 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:38 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:38 np0005591762 ceph-mon[75519]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 22 04:36:38 np0005591762 ceph-mon[75519]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 22 04:36:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:38 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 22 04:36:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:38.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 22 04:36:38 np0005591762 python3.9[89597]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:36:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:38 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:39 np0005591762 ceph-mon[75519]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 22 04:36:39 np0005591762 ceph-mon[75519]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 22 04:36:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:39 np0005591762 python3.9[89751]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:36:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:39 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006b90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:39.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.068972887 +0000 UTC m=+0.031133115 container create cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_ishizaka, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 22 04:36:40 np0005591762 systemd[1]: Started libpod-conmon-cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16.scope.
Jan 22 04:36:40 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.12272691 +0000 UTC m=+0.084887138 container init cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_ishizaka, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.128221798 +0000 UTC m=+0.090382026 container start cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_ishizaka, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.129275394 +0000 UTC m=+0.091435622 container attach cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_ishizaka, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 22 04:36:40 np0005591762 beautiful_ishizaka[89970]: 167 167
Jan 22 04:36:40 np0005591762 systemd[1]: libpod-cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16.scope: Deactivated successfully.
Jan 22 04:36:40 np0005591762 conmon[89970]: conmon cddd06056a62d5d1f08e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16.scope/container/memory.events
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.133020703 +0000 UTC m=+0.095180932 container died cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_ishizaka, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 22 04:36:40 np0005591762 systemd[1]: var-lib-containers-storage-overlay-77a9b09225363e8d6094a30237fedbde2acca0f5b93d32fc0a6ff093cbedaef6-merged.mount: Deactivated successfully.
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.149845988 +0000 UTC m=+0.112006216 container remove cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_ishizaka, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:36:40 np0005591762 podman[89917]: 2026-01-22 09:36:40.054100153 +0000 UTC m=+0.016260412 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:36:40 np0005591762 systemd[1]: libpod-conmon-cddd06056a62d5d1f08e5ce103933b854c7415403e1d2df60b1d23bcf44d6c16.scope: Deactivated successfully.
Jan 22 04:36:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:40 np0005591762 python3.9[89986]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:36:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:36:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:40.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: Reconfiguring rgw.rgw.compute-2.aqqfbf (unknown last config time)...
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aqqfbf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aqqfbf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: Reconfiguring daemon rgw.rgw.compute-2.aqqfbf on compute-2
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 22 04:36:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:40 np0005591762 podman[90256]: 2026-01-22 09:36:40.772660311 +0000 UTC m=+0.044846984 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:36:40 np0005591762 python3.9[90244]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:36:40 np0005591762 podman[90256]: 2026-01-22 09:36:40.846679613 +0000 UTC m=+0.118866285 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:36:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:41 np0005591762 podman[90465]: 2026-01-22 09:36:41.210258235 +0000 UTC m=+0.035989158 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:36:41 np0005591762 podman[90465]: 2026-01-22 09:36:41.214956741 +0000 UTC m=+0.040687664 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:36:41 np0005591762 podman[90593]: 2026-01-22 09:36:41.416001767 +0000 UTC m=+0.038638752 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:36:41 np0005591762 podman[90593]: 2026-01-22 09:36:41.42562134 +0000 UTC m=+0.048258315 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:36:41 np0005591762 python3.9[90566]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:36:41 np0005591762 network[90647]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:36:41 np0005591762 network[90649]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:36:41 np0005591762 network[90652]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:36:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:41 np0005591762 podman[90669]: 2026-01-22 09:36:41.57148349 +0000 UTC m=+0.039213025 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, name=keepalived, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vcs-type=git, version=2.2.4)
Jan 22 04:36:41 np0005591762 podman[90669]: 2026-01-22 09:36:41.574665067 +0000 UTC m=+0.042394603 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, distribution-scope=public, release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9)
Jan 22 04:36:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:41 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:41.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:42 np0005591762 podman[90715]: 2026-01-22 09:36:42.094308721 +0000 UTC m=+0.036590381 container exec 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:36:42 np0005591762 podman[90715]: 2026-01-22 09:36:42.109934505 +0000 UTC m=+0.052216165 container exec_died 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:36:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:42 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006b90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:36:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:42.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:36:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:42 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:36:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:36:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:43.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:36:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:44 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:44 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:36:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:44.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:44 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006b90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:45 np0005591762 python3.9[91020]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:36:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:45 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc006b90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:45.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:45 np0005591762 python3.9[91171]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:36:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:46 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80033c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:46.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:46 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:47 np0005591762 python3.9[91326]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:36:47 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:47 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:36:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:47 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:47.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:48 np0005591762 python3.9[91512]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:36:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:48 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/093648 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:36:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:36:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:48.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:36:48 np0005591762 python3.9[91596]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:36:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:48 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5114003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:49 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:49 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 22 04:36:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:49 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:49.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:50 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51080027a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:50 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 22 04:36:50 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:50 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:50 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:50 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:50 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:50.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:50 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51080027a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:51 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 22 04:36:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:51 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:51.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:52 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 22 04:36:52 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 47 pg[3.0( empty local-lis/les=26/27 n=0 ec=13/13 lis/c=26/26 les/c/f=27/27/0 sis=47 pruub=9.694473267s) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active pruub 165.248138428s@ mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:36:52 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 47 pg[3.0( empty local-lis/les=26/27 n=0 ec=13/13 lis/c=26/26 les/c/f=27/27/0 sis=47 pruub=9.694473267s) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown pruub 165.248138428s@ mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:52 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:52.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:52 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51080027a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1e( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1d( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1c( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.b( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.a( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.9( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.8( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.7( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.5( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.4( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.3( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1f( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.2( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.6( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.c( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.d( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.e( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.f( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.10( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.11( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.12( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.13( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.14( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.15( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.16( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.17( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.18( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.19( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1a( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1b( empty local-lis/les=26/27 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1d( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1c( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.b( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1e( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.8( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.7( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.9( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.5( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.4( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.a( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1f( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.2( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.6( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.0( empty local-lis/les=47/48 n=0 ec=13/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.e( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.d( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.f( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.10( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.11( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.12( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.13( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.14( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.15( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.16( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.18( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.3( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.19( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1a( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.17( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.1b( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 48 pg[3.c( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=26/26 les/c/f=27/27/0 sis=47) [2] r=0 lpr=47 pi=[26,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Jan 22 04:36:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Jan 22 04:36:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:53 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:53.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:54 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Jan 22 04:36:54 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Jan 22 04:36:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:54 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:54 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 22 04:36:54 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 49 pg[5.0( empty local-lis/les=26/27 n=0 ec=16/16 lis/c=26/26 les/c/f=27/27/0 sis=49 pruub=15.668217659s) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active pruub 173.248260498s@ mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:36:54 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 49 pg[5.0( empty local-lis/les=26/27 n=0 ec=16/16 lis/c=26/26 les/c/f=27/27/0 sis=49 pruub=15.668217659s) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown pruub 173.248260498s@ mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:36:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:54.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:36:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:54 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Jan 22 04:36:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Jan 22 04:36:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:55 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:55 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1c( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1f( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1e( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.11( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1d( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.10( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.13( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.12( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.15( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.14( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.17( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.16( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.9( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.8( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.b( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.a( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.6( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.7( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.4( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.5( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.2( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.3( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.e( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.f( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.c( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.d( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1a( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1b( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.18( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1c( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.19( empty local-lis/les=26/27 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1f( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1e( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.11( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1d( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.12( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.15( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.14( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.16( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.9( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.17( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.13( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.b( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.a( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.6( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.0( empty local-lis/les=49/50 n=0 ec=16/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.7( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.4( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.8( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.10( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.2( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.3( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.5( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.e( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.d( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1a( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.1b( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.c( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.18( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.19( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 50 pg[5.f( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=26/26 les/c/f=27/27/0 sis=49) [2] r=0 lpr=49 pi=[26,49)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:36:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:55 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108003be0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:36:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:55.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:36:56 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.b scrub starts
Jan 22 04:36:56 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.b scrub ok
Jan 22 04:36:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:56 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108003be0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 22 04:36:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:36:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:56.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:36:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:36:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:56 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:57 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:36:57 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Jan 22 04:36:57 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Jan 22 04:36:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:57 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:57 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 22 04:36:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:57 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:36:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:57.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:36:58 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Jan 22 04:36:58 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Jan 22 04:36:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:58 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51100027a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:36:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000019s ======
Jan 22 04:36:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:36:58.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000019s
Jan 22 04:36:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:58 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108003be0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:59 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Jan 22 04:36:59 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Jan 22 04:36:59 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:36:59 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:59 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:36:59 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:59 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 22 04:36:59 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 22 04:36:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:36:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:36:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:36:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:36:59 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:36:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:36:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:36:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:36:59.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:37:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:37:00 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Jan 22 04:37:00 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Jan 22 04:37:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400ce00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:00 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 22 04:37:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 22 04:37:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:00 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:00 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:00.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:00 np0005591762 systemd[82971]: Starting Mark boot as successful...
Jan 22 04:37:00 np0005591762 systemd[82971]: Finished Mark boot as successful.
Jan 22 04:37:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51100032c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:01 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Jan 22 04:37:01 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Jan 22 04:37:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:37:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 22 04:37:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 22 04:37:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:01 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108004f30 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:01.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:02 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Jan 22 04:37:02 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Jan 22 04:37:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:02 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108004f30 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:37:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:02.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:02 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:03 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Jan 22 04:37:03 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Jan 22 04:37:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:03 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:03.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:04 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.a scrub starts
Jan 22 04:37:04 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.a scrub ok
Jan 22 04:37:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:04 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:04 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:37:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:04.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:04 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108004f30 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:05 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Jan 22 04:37:05 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Jan 22 04:37:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:05 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5110003be0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:05.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1 deep-scrub starts
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1 deep-scrub ok
Jan 22 04:37:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1d( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901328087s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.596237183s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1d( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901303291s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.596237183s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1f( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901113510s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.596221924s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1f( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901097298s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.596221924s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1c( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.899165154s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.594345093s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1c( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.899141312s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.594345093s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.19( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866537094s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561767578s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.19( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866527557s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561767578s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1e( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.900926590s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.596237183s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1e( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.900917053s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.596237183s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.18( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866438866s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561767578s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.18( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866430283s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561767578s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.17( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866470337s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561874390s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.17( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866460800s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561874390s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.11( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.900816917s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.596237183s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.11( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.900808334s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.596237183s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.16( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866244316s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561767578s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.10( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.903139114s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598678589s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.16( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866234779s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561767578s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.10( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.903128624s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598678589s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.14( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866106033s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561737061s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.14( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866097450s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561737061s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.13( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866073608s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561737061s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.13( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866063118s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561737061s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.15( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.902223587s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.597961426s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.15( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.902215958s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.597961426s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.12( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865991592s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561737061s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.12( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865982056s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561737061s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.17( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.902300835s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598144531s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.14( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.902147293s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.597961426s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.14( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901960373s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.597961426s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.10( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865616798s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561706543s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.10( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865606308s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561706543s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.16( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901845932s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598022461s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.16( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901836395s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598022461s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.f( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865365982s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561645508s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.9( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901834488s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598114014s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.9( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901823044s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598114014s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.f( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865351677s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561645508s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.d( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865260124s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561645508s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.d( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.865252495s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561645508s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.c( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.867000580s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.563507080s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.a( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901667595s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598190308s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.c( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.866989136s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.563507080s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.a( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901654243s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598190308s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.6( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901576996s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598190308s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.6( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901567459s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598190308s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.6( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864909172s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561553955s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.6( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864899635s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561553955s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864841461s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561538696s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864832878s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561538696s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.7( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901538849s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598266602s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.7( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901522636s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598266602s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.2( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864703178s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561523438s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.2( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864692688s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561523438s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.3( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864809036s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561721802s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.3( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864800453s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561721802s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.5( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901765823s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598754883s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.5( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901758194s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598754883s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.4( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864466667s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561523438s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.4( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864452362s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561523438s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.2( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901521683s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598678589s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.17( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901000023s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598144531s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.2( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901512146s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598678589s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.5( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864176750s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561431885s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.3( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901429176s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598678589s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.5( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864166260s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561431885s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.3( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901420593s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598678589s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.7( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864084244s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561416626s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901410103s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598739624s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.7( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.864074707s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561416626s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901401520s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598739624s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.f( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901308060s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598754883s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.f( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901300430s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598754883s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.c( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901202202s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598739624s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.c( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901193619s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598739624s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.b( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863698006s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561325073s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.b( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863681793s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561325073s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1c( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863627434s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561309814s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1c( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863614082s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561309814s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1b( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901092529s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.598831177s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.1b( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901082039s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.598831177s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1e( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863510132s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561325073s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.18( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901325226s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.599151611s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.18( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901315689s) [1] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.599151611s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1e( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863499641s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561325073s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1f( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863599777s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561508179s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.1f( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.863591194s) [0] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561508179s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.19( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901348114s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 active pruub 182.599288940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[5.19( empty local-lis/les=49/50 n=0 ec=49/16 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.901339531s) [0] r=-1 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.599288940s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.1d( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.11( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.16( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.16( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.19( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.a( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.862797737s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active pruub 180.561569214s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[3.a( empty local-lis/les=47/48 n=0 ec=47/13 lis/c=47/47 les/c/f=48/48/0 sis=57 pruub=10.862784386s) [1] r=-1 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 180.561569214s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.15( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.1c( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.13( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.1f( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.5( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.b( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.8( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.6( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.a( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.7( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.a( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.3( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.f( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.3( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.1( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.9( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.3( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.1( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.e( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.f( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.d( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.3( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.2( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.8( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.c( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.5( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.b( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.9( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.6( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.d( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[6.9( empty local-lis/les=0/0 n=0 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.1d( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.1c( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.1b( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.1f( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.12( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.11( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.17( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.f( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.9( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.13( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.5( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.5( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.1( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.a( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.4( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.b( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.3( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.c( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.4( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.7( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.2( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.d( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.3( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.a( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.f( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.1e( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.10( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.12( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.16( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.13( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.1d( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.11( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.1a( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.15( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.1e( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.18( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[2.18( empty local-lis/les=0/0 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.14( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[10.10( empty local-lis/les=0/0 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[7.1d( empty local-lis/les=0/0 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.2( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.1c( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.14( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[8.1f( empty local-lis/les=0/0 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[4.15( empty local-lis/les=0/0 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.17( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[11.19( empty local-lis/les=0/0 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 57 pg[12.11( empty local-lis/les=0/0 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:06.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.1c( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.11( v 54'58 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.16( v 43'2 lc 0'0 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.15( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.17( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.16( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.13( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.1d( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.13( v 54'58 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.1c( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.b( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.3( v 54'99 lc 43'84 (0'0,54'99] local-lis/les=57/58 n=1 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=54'99 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.f( v 43'42 lc 42'1 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.3( v 29'6 (0'0,29'6] local-lis/les=57/58 n=1 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.1d( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.1( v 43'96 (0'0,43'96] local-lis/les=57/58 n=1 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.d( v 43'42 lc 42'13 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.7( v 54'58 lc 0'0 (0'0,54'58] local-lis/les=57/58 n=1 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.c( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.2( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.3( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.1( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.5( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.f( v 29'6 lc 0'0 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.7( v 43'42 lc 42'21 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.a( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.9( v 29'6 lc 0'0 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.a( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.5( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.6( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.3( v 43'42 lc 0'0 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.1( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.e( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.d( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.f( v 43'96 (0'0,43'96] local-lis/les=57/58 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.9( v 54'58 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.b( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.5( v 43'42 lc 42'11 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.8( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.d( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.3( v 54'58 (0'0,54'58] local-lis/les=57/58 n=1 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.2( v 29'6 (0'0,29'6] local-lis/les=57/58 n=1 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.c( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.4( v 43'96 (0'0,43'96] local-lis/les=57/58 n=1 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.2( v 54'58 (0'0,54'58] local-lis/les=57/58 n=1 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.5( v 29'6 (0'0,29'6] local-lis/les=57/58 n=1 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.9( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.a( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.f( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.8( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.14( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.19( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.6( v 29'6 lc 0'0 (0'0,29'6] local-lis/les=57/58 n=1 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.15( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.10( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.1d( v 54'58 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[6.b( v 43'42 lc 0'0 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=49/49 les/c/f=51/51/0 sis=57) [2] r=0 lpr=57 pi=[49,57)/1 crt=43'42 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.13( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.14( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.12( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.1e( v 54'58 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.1f( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.15( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.16( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.1a( v 56'61 lc 54'57 (0'0,56'61] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=56'61 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.18( v 54'58 lc 43'19 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.11( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.1c( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.1f( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.1e( v 43'96 (0'0,43'96] local-lis/les=57/58 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[4.19( empty local-lis/les=57/58 n=0 ec=47/15 lis/c=47/47 les/c/f=48/48/0 sis=57) [2] r=0 lpr=57 pi=[47,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.11( v 43'96 (0'0,43'96] local-lis/les=57/58 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.1d( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.1b( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[8.11( v 29'6 (0'0,29'6] local-lis/les=57/58 n=0 ec=51/28 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=29'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[7.1f( empty local-lis/les=57/58 n=0 ec=51/18 lis/c=51/51 les/c/f=52/52/0 sis=57) [2] r=0 lpr=57 pi=[51,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.12( v 43'96 (0'0,43'96] local-lis/les=57/58 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.17( v 54'58 (0'0,54'58] local-lis/les=57/58 n=0 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.18( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[10.10( v 43'96 (0'0,43'96] local-lis/les=57/58 n=0 ec=53/32 lis/c=53/53 les/c/f=54/54/0 sis=57) [2] r=0 lpr=57 pi=[53,57)/1 crt=43'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[2.a( empty local-lis/les=57/58 n=0 ec=45/12 lis/c=45/45 les/c/f=46/46/0 sis=57) [2] r=0 lpr=57 pi=[45,57)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[12.4( v 54'58 (0'0,54'58] local-lis/les=57/58 n=1 ec=55/40 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=54'58 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 58 pg[11.3( v 43'2 (0'0,43'2] local-lis/les=57/58 n=0 ec=55/34 lis/c=55/55 les/c/f=56/56/0 sis=57) [2] r=0 lpr=57 pi=[55,57)/1 crt=43'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:07 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108004f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:07.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:08 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5110003be0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:08 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 22 04:37:08 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 22 04:37:08 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 22 04:37:08 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 22 04:37:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 22 04:37:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:08.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:08 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 22 04:37:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 22 04:37:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 22 04:37:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:09 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:09.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:10 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 22 04:37:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971851349s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 43'42 active pruub 186.711303711s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971710205s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711303711s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971446991s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 43'42 active pruub 186.711517334s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971376419s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711517334s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971332550s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 43'42 active pruub 186.711608887s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971276283s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711608887s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971303940s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 43'42 active pruub 186.711975098s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=12.971244812s) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711975098s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.17( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.7( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.3( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 22 04:37:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 22 04:37:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 22 04:37:10 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 61 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:10.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:10 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5110003be0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 22 04:37:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.17( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.17( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.3( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.3( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.7( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.7( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 22 04:37:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 22 04:37:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:11 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:11.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:12 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1a scrub starts
Jan 22 04:37:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:12 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:12 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1a scrub ok
Jan 22 04:37:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 22 04:37:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 22 04:37:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:12.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:12 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:13 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 22 04:37:13 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 22 04:37:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:13 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5110004f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:13.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341123581s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 43'42 active pruub 186.711380005s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341254234s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 43'42 active pruub 186.711715698s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:14 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Jan 22 04:37:14 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Jan 22 04:37:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 22 04:37:14 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:14.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:14 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:15 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Jan 22 04:37:15 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Jan 22 04:37:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:15 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:15.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 22 04:37:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:16 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5110004f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:16.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:16 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=54'1164 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:17 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:17 np0005591762 ceph-mon[75519]: Health check failed: Degraded data redundancy: 2/230 objects degraded (0.870%), 1 pg degraded (PG_DEGRADED)
Jan 22 04:37:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:17 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:17.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:18 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 22 04:37:18 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:18 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:18 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:18 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:18 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 22 04:37:18 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 22 04:37:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/093718 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:37:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:37:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:18.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:19 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 22 04:37:19 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 22 04:37:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:19 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:19 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:19.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:20 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 22 04:37:20 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 22 04:37:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:37:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:20.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:21 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Jan 22 04:37:21 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Jan 22 04:37:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:21 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:21.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:22 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Jan 22 04:37:22 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Jan 22 04:37:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 22 04:37:22 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 22 04:37:22 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 22 04:37:22 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 22 04:37:22 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 22 04:37:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:37:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:22.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:23 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 22 04:37:23 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Jan 22 04:37:23 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Jan 22 04:37:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:23 np0005591762 ceph-mon[75519]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 2/230 objects degraded (0.870%), 1 pg degraded)
Jan 22 04:37:23 np0005591762 ceph-mon[75519]: Cluster is now healthy
Jan 22 04:37:23 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 22 04:37:23 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 22 04:37:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:23 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:24 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.e scrub starts
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 22 04:37:24 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.e scrub ok
Jan 22 04:37:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 22 04:37:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 22 04:37:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:24.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:25 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Jan 22 04:37:25 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Jan 22 04:37:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:25 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 22 04:37:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:25 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:26 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.b scrub starts
Jan 22 04:37:26 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.b scrub ok
Jan 22 04:37:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 22 04:37:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 22 04:37:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 22 04:37:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 22 04:37:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 22 04:37:26 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:26 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:26.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:27 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.0 deep-scrub starts
Jan 22 04:37:27 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.0 deep-scrub ok
Jan 22 04:37:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:27 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 22 04:37:27 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 22 04:37:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 22 04:37:27 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:27 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:27 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:27 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:27 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80024b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:28 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Jan 22 04:37:28 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Jan 22 04:37:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:28 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 22 04:37:28 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 22 04:37:28 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 22 04:37:28 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 22 04:37:28 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 22 04:37:28 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806510925s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 active pruub 202.711914062s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:28 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:28.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:28 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:28 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Jan 22 04:37:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/093729 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:37:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 22 04:37:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:29 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:29 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:29.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:30 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 22 04:37:30 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 22 04:37:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80024b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:30.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 22 04:37:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 22 04:37:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 22 04:37:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 22 04:37:30 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 22 04:37:30 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:30 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:31 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.d scrub starts
Jan 22 04:37:31 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.d scrub ok
Jan 22 04:37:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 22 04:37:31 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:31 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:31 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:31 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 22 04:37:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 22 04:37:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:31 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:32 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Jan 22 04:37:32 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Jan 22 04:37:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:32.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 22 04:37:32 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:32 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:33 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Jan 22 04:37:33 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Jan 22 04:37:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 22 04:37:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:33 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80024b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:33.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 22 04:37:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 22 04:37:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 22 04:37:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:34 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:35 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Jan 22 04:37:35 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Jan 22 04:37:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:35 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:35.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:36 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 22 04:37:36 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 22 04:37:36 np0005591762 python3.9[92025]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:37:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8002650 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:36.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 22 04:37:36 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 22 04:37:36 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 22 04:37:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 22 04:37:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 22 04:37:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc007850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:37 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.17 deep-scrub starts
Jan 22 04:37:37 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.17 deep-scrub ok
Jan 22 04:37:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:37 np0005591762 python3.9[92314]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 22 04:37:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:37 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 22 04:37:37 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 22 04:37:37 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 22 04:37:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:37 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:37:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:37.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:38 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Jan 22 04:37:38 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Jan 22 04:37:38 np0005591762 python3.9[92466]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 22 04:37:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:38 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:38 np0005591762 python3.9[92619]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:37:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:38.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:38 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 22 04:37:38 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 22 04:37:38 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 22 04:37:38 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 22 04:37:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 22 04:37:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:38 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8003f90 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:39 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.b scrub starts
Jan 22 04:37:39 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.b scrub ok
Jan 22 04:37:39 np0005591762 python3.9[92772]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 22 04:37:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:39 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00d3a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 22 04:37:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 22 04:37:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:39.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:40 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.3 deep-scrub starts
Jan 22 04:37:40 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.3 deep-scrub ok
Jan 22 04:37:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:40 np0005591762 python3.9[92924]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:37:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:40.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 22 04:37:40 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 22 04:37:40 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 22 04:37:40 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 22 04:37:40 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 22 04:37:40 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374924660s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 active pruub 213.398483276s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:40 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374703407s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 active pruub 213.398498535s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:40 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:40 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:37:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:37:40 np0005591762 python3.9[93077]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:37:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:41 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 22 04:37:41 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 22 04:37:41 np0005591762 python3.9[93156]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:37:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:41 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8003f90 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:41 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 22 04:37:41 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 22 04:37:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 22 04:37:41 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:41 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:41 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:41 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:41.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:42 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 22 04:37:42 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 22 04:37:42 np0005591762 python3.9[93308]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:37:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:42 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00b340 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:42.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:42 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 22 04:37:42 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 22 04:37:42 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 22 04:37:42 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 22 04:37:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 22 04:37:42 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:42 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:42 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:43 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Jan 22 04:37:43 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Jan 22 04:37:43 np0005591762 python3.9[93463]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 22 04:37:43 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676091194s) [1] async=[1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 43'1161 active pruub 222.056365967s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:43 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:43 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671806335s) [1] async=[1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 43'1161 active pruub 222.052169800s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:43 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:43 np0005591762 python3.9[93617]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 22 04:37:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 22 04:37:43 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 22 04:37:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:37:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:43.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:44 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 22 04:37:44 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 22 04:37:44 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 22 04:37:44 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988652229s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 active pruub 217.372222900s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:44 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:44 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988155365s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 active pruub 217.372268677s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:44 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:44 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8003f90 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:44 np0005591762 python3.9[93770]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 04:37:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:44.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 22 04:37:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 22 04:37:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:44 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:45 np0005591762 python3.9[93923]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 22 04:37:45 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 22 04:37:45 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 22 04:37:45 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 22 04:37:45 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:45 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:45 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:45 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:37:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:45 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:45 np0005591762 python3.9[94076]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:37:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:45.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:46 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 22 04:37:46 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 22 04:37:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 22 04:37:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:46 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:46 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:46 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:37:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:37:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:46.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:46 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8003f90 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:47 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Jan 22 04:37:47 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Jan 22 04:37:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 22 04:37:47 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137292862s) [1] async=[1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 43'1161 active pruub 225.539581299s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:47 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:47 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135436058s) [1] async=[1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 43'1161 active pruub 225.538360596s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:37:47 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:37:47 np0005591762 python3.9[94231]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:37:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:47 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:47.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:47 np0005591762 python3.9[94451]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:37:48 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Jan 22 04:37:48 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Jan 22 04:37:48 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 22 04:37:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:48 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:48 np0005591762 python3.9[94541]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:37:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:48.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:48 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:48 np0005591762 python3.9[94694]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:37:49 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Jan 22 04:37:49 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Jan 22 04:37:49 np0005591762 python3.9[94773]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:37:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/093749 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:37:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:49 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:49.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:50 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 22 04:37:50 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 22 04:37:50 np0005591762 python3.9[94950]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:37:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:50 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:50 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:51 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.d scrub starts
Jan 22 04:37:51 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.d scrub ok
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:37:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:51 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:51.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:52 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 22 04:37:52 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 22 04:37:52 np0005591762 python3.9[95103]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:37:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 22 04:37:52 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 22 04:37:52 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 22 04:37:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:52 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5108005850 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:52 np0005591762 python3.9[95256]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 22 04:37:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:52 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:53 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 22 04:37:53 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 22 04:37:53 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 22 04:37:53 np0005591762 python3.9[95407]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:37:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:53 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:53.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:54 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 22 04:37:54 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 22 04:37:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 22 04:37:54 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 22 04:37:54 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 22 04:37:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:54 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:54 np0005591762 python3.9[95560]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:37:54 np0005591762 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 22 04:37:54 np0005591762 systemd[1]: tuned.service: Deactivated successfully.
Jan 22 04:37:54 np0005591762 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 22 04:37:54 np0005591762 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 04:37:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:54.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:54 np0005591762 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 04:37:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:54 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f510400d8a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:55 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.3 deep-scrub starts
Jan 22 04:37:55 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.3 deep-scrub ok
Jan 22 04:37:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 22 04:37:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:37:55 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 22 04:37:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:55 np0005591762 python3.9[95748]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 22 04:37:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:55 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c4d60 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:37:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:55.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:37:56 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Jan 22 04:37:56 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Jan 22 04:37:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 22 04:37:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:56 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118006770 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:56.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:37:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:56 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:57 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.c scrub starts
Jan 22 04:37:57 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.c scrub ok
Jan 22 04:37:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 22 04:37:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:57 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:57.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:37:58 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.4 deep-scrub starts
Jan 22 04:37:58 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.4 deep-scrub ok
Jan 22 04:37:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 22 04:37:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:58 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c4d60 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:58 np0005591762 python3.9[95902]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:37:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:37:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:37:58.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:37:58 np0005591762 python3.9[96057]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:37:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:58 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118006770 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:59 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 22 04:37:59 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 22 04:37:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:37:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:37:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:37:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:37:59 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:37:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:37:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:37:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:37:59.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:00 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.a deep-scrub starts
Jan 22 04:38:00 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.a deep-scrub ok
Jan 22 04:38:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:00 np0005591762 systemd[1]: session-37.scope: Deactivated successfully.
Jan 22 04:38:00 np0005591762 systemd[1]: session-37.scope: Consumed 48.304s CPU time.
Jan 22 04:38:00 np0005591762 systemd-logind[744]: Session 37 logged out. Waiting for processes to exit.
Jan 22 04:38:00 np0005591762 systemd-logind[744]: Removed session 37.
Jan 22 04:38:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:00.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:01 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Jan 22 04:38:01 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Jan 22 04:38:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:01 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118007480 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:38:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:01.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:02 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Jan 22 04:38:02 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Jan 22 04:38:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:02 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 22 04:38:02 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 22 04:38:02 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 22 04:38:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:02.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:02 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 22 04:38:03 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.d deep-scrub starts
Jan 22 04:38:03 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.d deep-scrub ok
Jan 22 04:38:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 22 04:38:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:03 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:03.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 22 04:38:04 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 22 04:38:04 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 22 04:38:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:04 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118007600 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 22 04:38:04 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 22 04:38:04 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 22 04:38:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:04.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:04 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c6c00 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:05 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Jan 22 04:38:05 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Jan 22 04:38:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 22 04:38:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:05 np0005591762 systemd-logind[744]: New session 38 of user zuul.
Jan 22 04:38:05 np0005591762 systemd[1]: Started Session 38 of User zuul.
Jan 22 04:38:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:05 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:05.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:06 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Jan 22 04:38:06 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Jan 22 04:38:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 22 04:38:06 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 22 04:38:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 22 04:38:06 np0005591762 python3.9[96244]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:06.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:38:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:07 np0005591762 python3.9[96402]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 22 04:38:07 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Jan 22 04:38:07 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Jan 22 04:38:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 22 04:38:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:07 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:07.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:08 np0005591762 python3.9[96555]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:38:08 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Jan 22 04:38:08 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.14 deep-scrub ok
Jan 22 04:38:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:08 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118007f20 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:08 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 22 04:38:08 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 22 04:38:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 22 04:38:08 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827784538s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 active pruub 245.395080566s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:08 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:08 np0005591762 python3.9[96640]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 04:38:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:08.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:08 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:09 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 22 04:38:09 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 22 04:38:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 22 04:38:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 22 04:38:09 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:09 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:09 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:09.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:10 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.14 deep-scrub starts
Jan 22 04:38:10 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.14 deep-scrub ok
Jan 22 04:38:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:10 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 22 04:38:10 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 22 04:38:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 22 04:38:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:10 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:38:10 np0005591762 python3.9[96819]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:38:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:10.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:38:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:10 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118008840 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:11 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Jan 22 04:38:11 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Jan 22 04:38:11 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 22 04:38:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 22 04:38:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992759705s) [1] async=[1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 43'1161 active pruub 249.603393555s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:11 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:11 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:38:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:11.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:12 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1f deep-scrub starts
Jan 22 04:38:12 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1f deep-scrub ok
Jan 22 04:38:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:12 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:12 np0005591762 python3.9[96974]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:38:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 22 04:38:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:38:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:12.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:38:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:12 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:13 np0005591762 python3.9[97128]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:13 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 22 04:38:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:13 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:13 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 22 04:38:13 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 22 04:38:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:13 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118009160 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:13 np0005591762 python3.9[97281]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 22 04:38:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:38:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:13.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:38:14 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Jan 22 04:38:14 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 22 04:38:14 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:38:14 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Jan 22 04:38:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:14 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:14 np0005591762 python3.9[97431]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:14.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:14 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:15 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 22 04:38:15 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 22 04:38:15 np0005591762 python3.9[97591]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:15 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:15.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:16 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Jan 22 04:38:16 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Jan 22 04:38:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:16 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118009160 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 22 04:38:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 22 04:38:16 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 22 04:38:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:16.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 22 04:38:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:16 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:17 np0005591762 python3.9[97745]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:38:17 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1c deep-scrub starts
Jan 22 04:38:17 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1c deep-scrub ok
Jan 22 04:38:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 22 04:38:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:17 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:17.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:18 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Jan 22 04:38:18 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Jan 22 04:38:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:18 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 22 04:38:18 np0005591762 python3.9[98033]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 22 04:38:18 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 22 04:38:18 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 22 04:38:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:18.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118009160 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:19 np0005591762 python3.9[98184]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:38:19 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 22 04:38:19 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 22 04:38:19 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 22 04:38:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:19 np0005591762 python3.9[98339]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:19 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:19.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:20 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.11 deep-scrub starts
Jan 22 04:38:20 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.11 deep-scrub ok
Jan 22 04:38:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:20 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 22 04:38:20 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256608009s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 active pruub 251.942764282s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:20 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:20 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 22 04:38:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 22 04:38:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:20.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:21 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 22 04:38:21 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 22 04:38:21 np0005591762 python3.9[98494]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 22 04:38:21 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:21 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.493573) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074701493601, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 3408, "num_deletes": 251, "total_data_size": 8177843, "memory_usage": 8346608, "flush_reason": "Manual Compaction"}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074701503113, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5176700, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7242, "largest_seqno": 10645, "table_properties": {"data_size": 5160903, "index_size": 10179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4549, "raw_key_size": 42388, "raw_average_key_size": 23, "raw_value_size": 5125676, "raw_average_value_size": 2825, "num_data_blocks": 441, "num_entries": 1814, "num_filter_entries": 1814, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074588, "oldest_key_time": 1769074588, "file_creation_time": 1769074701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 9567 microseconds, and 7022 cpu microseconds.
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.503138) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5176700 bytes OK
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.503156) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.503440) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.503451) EVENT_LOG_v1 {"time_micros": 1769074701503448, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.503460) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8160840, prev total WAL file size 8160840, number of live WAL files 2.
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.504493) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5055KB)], [18(12MB)]
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074701504528, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18134375, "oldest_snapshot_seqno": -1}
Jan 22 04:38:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3913 keys, 14129796 bytes, temperature: kUnknown
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074701535517, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14129796, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14098158, "index_size": 20776, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9797, "raw_key_size": 99412, "raw_average_key_size": 25, "raw_value_size": 14020909, "raw_average_value_size": 3583, "num_data_blocks": 898, "num_entries": 3913, "num_filter_entries": 3913, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769074701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.535666) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14129796 bytes
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.536046) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 584.3 rd, 455.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.9, 12.4 +0.0 blob) out(13.5 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4444, records dropped: 531 output_compression: NoCompression
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.536060) EVENT_LOG_v1 {"time_micros": 1769074701536054, "job": 8, "event": "compaction_finished", "compaction_time_micros": 31034, "compaction_time_cpu_micros": 19706, "output_level": 6, "num_output_files": 1, "total_output_size": 14129796, "num_input_records": 4444, "num_output_records": 3913, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074701536745, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074701538079, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.504449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.538111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.538114) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.538116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.538117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:38:21.538118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:38:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:21 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118009a80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:21.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:22 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.12 deep-scrub starts
Jan 22 04:38:22 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.12 deep-scrub ok
Jan 22 04:38:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 22 04:38:22 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:38:22 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 22 04:38:22 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 22 04:38:22 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 22 04:38:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:22.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:23 np0005591762 python3.9[98648]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:38:23 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 22 04:38:23 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293571472s) [0] async=[0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 43'1161 active pruub 261.695831299s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:23 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:23 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 22 04:38:23 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 22 04:38:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:23 np0005591762 python3.9[98803]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 22 04:38:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:23 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:23.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:24 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 22 04:38:24 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967482567s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 active pruub 257.373352051s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:24 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:24 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 22 04:38:24 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 22 04:38:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5118009a80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 22 04:38:24 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 22 04:38:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 22 04:38:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:24 np0005591762 systemd[1]: session-38.scope: Deactivated successfully.
Jan 22 04:38:24 np0005591762 systemd[1]: session-38.scope: Consumed 12.869s CPU time.
Jan 22 04:38:24 np0005591762 systemd-logind[744]: Session 38 logged out. Waiting for processes to exit.
Jan 22 04:38:24 np0005591762 systemd-logind[744]: Removed session 38.
Jan 22 04:38:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:24.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5104000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:25 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 22 04:38:25 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 22 04:38:25 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 22 04:38:25 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:25 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:25 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:25.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:26 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 22 04:38:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 22 04:38:26 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 22 04:38:26 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:38:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8002fd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 22 04:38:27 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001114845s) [0] async=[0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 43'1161 active pruub 265.420257568s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:27 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:27 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Jan 22 04:38:27 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Jan 22 04:38:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:27 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51240023d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:27.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:28 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 22 04:38:28 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 22 04:38:28 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 22 04:38:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:28.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:29 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 22 04:38:29 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 22 04:38:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:29 np0005591762 systemd-logind[744]: New session 39 of user zuul.
Jan 22 04:38:29 np0005591762 systemd[1]: Started Session 39 of User zuul.
Jan 22 04:38:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:29 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8002fd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:29.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:30 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Jan 22 04:38:30 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Jan 22 04:38:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124002f80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:30 np0005591762 python3.9[99014]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:38:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:30.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:38:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124002f80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:31 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Jan 22 04:38:31 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Jan 22 04:38:31 np0005591762 python3.9[99170]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:38:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:31 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:38:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:31.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:38:32 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Jan 22 04:38:32 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Jan 22 04:38:32 np0005591762 python3.9[99363]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:38:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8002fd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 22 04:38:32 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 22 04:38:32 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 22 04:38:32 np0005591762 systemd[1]: session-39.scope: Deactivated successfully.
Jan 22 04:38:32 np0005591762 systemd[1]: session-39.scope: Consumed 1.642s CPU time.
Jan 22 04:38:32 np0005591762 systemd-logind[744]: Session 39 logged out. Waiting for processes to exit.
Jan 22 04:38:32 np0005591762 systemd-logind[744]: Removed session 39.
Jan 22 04:38:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:32.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:33 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 22 04:38:33 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 22 04:38:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:33 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 22 04:38:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:33 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124002f80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:33.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Jan 22 04:38:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Jan 22 04:38:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 22 04:38:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:34 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 22 04:38:34 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 22 04:38:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:38:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:34.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:38:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:35 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Jan 22 04:38:35 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Jan 22 04:38:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 22 04:38:35 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:35 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 04:38:35 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 22 04:38:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:35 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:35.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:36 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 22 04:38:36 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 22 04:38:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124003100 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 22 04:38:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:36.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124003100 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:37 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Jan 22 04:38:37 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Jan 22 04:38:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 22 04:38:37 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 04:38:37 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 04:38:37 np0005591762 systemd-logind[744]: New session 40 of user zuul.
Jan 22 04:38:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:37 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:37 np0005591762 systemd[1]: Started Session 40 of User zuul.
Jan 22 04:38:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:38 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 22 04:38:38 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 22 04:38:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:38 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:38 np0005591762 python3.9[99548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 22 04:38:38 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 04:38:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:38:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:38.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:38:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:39 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8002fd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:39 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 22 04:38:39 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 22 04:38:39 np0005591762 python3.9[99703]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:39 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51240057f0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:38:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:39.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:38:40 np0005591762 python3.9[99860]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:38:40 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 22 04:38:40 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 22 04:38:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:40 np0005591762 python3.9[99945]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:38:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:40.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:38:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:41 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:41 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 22 04:38:41 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 22 04:38:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:41 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:41.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:42 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 22 04:38:42 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 22 04:38:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:42 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51240057f0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:42 np0005591762 python3.9[100099]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:38:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 22 04:38:42 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 22 04:38:42 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 22 04:38:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:42.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:43 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 22 04:38:43 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.b scrub starts
Jan 22 04:38:43 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.b scrub ok
Jan 22 04:38:43 np0005591762 python3.9[100296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:38:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 22 04:38:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:43 np0005591762 python3.9[100448]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:38:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:43.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:44 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 22 04:38:44 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 22 04:38:44 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 22 04:38:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:44 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80040d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:44 np0005591762 python3.9[100609]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:38:44 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:38:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 22 04:38:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 22 04:38:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:44.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:44 np0005591762 python3.9[100688]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:38:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:45 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51240057f0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:45 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 22 04:38:45 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 22 04:38:45 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 22 04:38:45 np0005591762 python3.9[100841]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:38:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:45 np0005591762 python3.9[100919]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:38:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:45 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:45.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:46 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.13 deep-scrub starts
Jan 22 04:38:46 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.13 deep-scrub ok
Jan 22 04:38:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 22 04:38:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:46 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:46 np0005591762 python3.9[101071]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:38:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:46.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:46 np0005591762 python3.9[101224]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:38:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:47 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 22 04:38:47 np0005591762 python3.9[101377]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:38:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:47 np0005591762 python3.9[101529]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:38:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:47 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124006110 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:38:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:47.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:38:48 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 22 04:38:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:48 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:48 np0005591762 python3.9[101682]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:48.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:49 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:49 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:49.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:50 np0005591762 python3.9[101862]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:38:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:50 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124006110 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:50 np0005591762 python3.9[102017]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:38:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:50.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:51 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:51 np0005591762 python3.9[102170]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:38:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:51 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:51 np0005591762 python3.9[102322]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:38:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:38:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:51.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:38:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:52 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80040d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:52 np0005591762 python3.9[102475]: ansible-service_facts Invoked
Jan 22 04:38:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:52 np0005591762 network[102493]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:38:52 np0005591762 network[102494]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:38:52 np0005591762 network[102495]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:38:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:38:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:52.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:38:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:53 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124006110 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:53 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c390 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:53.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:54 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:54.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:55 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80040d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:55 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5124006110 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:55 np0005591762 python3.9[103007]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:38:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:55.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:56 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c780 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:56.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:38:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:57 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50fc00c780 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:57 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:38:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:58 np0005591762 python3.9[103187]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 22 04:38:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:58 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120000df0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:38:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:38:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:38:58.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:38:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:38:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:38:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:38:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:38:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:38:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:38:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:59 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128004aa0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:38:59 np0005591762 python3.9[103341]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:38:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:38:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:38:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:38:59 np0005591762 python3.9[103419]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:38:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:38:59 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:39:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:00.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:39:00 np0005591762 python3.9[103571]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:00 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:00 np0005591762 python3.9[103649]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:00.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:01 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:01 np0005591762 python3.9[103803]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:01 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51280055a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:39:01 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:39:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:02.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:02 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:02.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:03 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:03 np0005591762 python3.9[103981]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:39:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:03 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:03 np0005591762 python3.9[104066]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:39:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:39:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:04.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:39:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:04 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51280055a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:04 np0005591762 systemd[1]: session-40.scope: Deactivated successfully.
Jan 22 04:39:04 np0005591762 systemd[1]: session-40.scope: Consumed 16.652s CPU time.
Jan 22 04:39:04 np0005591762 systemd-logind[744]: Session 40 logged out. Waiting for processes to exit.
Jan 22 04:39:04 np0005591762 systemd-logind[744]: Removed session 40.
Jan 22 04:39:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:04.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:05 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:05 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:39:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:06.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:39:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:06 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51200019c0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:06.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:07 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51280055a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:07 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f511c0c5bf0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:08.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:08 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:39:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:08.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:39:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:09 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120003b80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:09 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128006ae0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:10.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:10 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128006ae0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:10 np0005591762 systemd-logind[744]: New session 41 of user zuul.
Jan 22 04:39:10 np0005591762 systemd[1]: Started Session 41 of User zuul.
Jan 22 04:39:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:10.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:10 np0005591762 python3.9[104280]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:11 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128006ae0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:11 np0005591762 python3.9[104433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:11 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128006ae0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:11 np0005591762 python3.9[104511]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:12.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:12 np0005591762 systemd[1]: session-41.scope: Deactivated successfully.
Jan 22 04:39:12 np0005591762 systemd[1]: session-41.scope: Consumed 1.074s CPU time.
Jan 22 04:39:12 np0005591762 systemd-logind[744]: Session 41 logged out. Waiting for processes to exit.
Jan 22 04:39:12 np0005591762 systemd-logind[744]: Removed session 41.
Jan 22 04:39:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:12 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f80040d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:12.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:13 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128006ae0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:13 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120003b80 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:39:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:14.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:39:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:14 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:14.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:15 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:15 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128006ae0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:16.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:16 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:16.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:17 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:17 np0005591762 systemd-logind[744]: New session 42 of user zuul.
Jan 22 04:39:17 np0005591762 systemd[1]: Started Session 42 of User zuul.
Jan 22 04:39:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:17 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:18 np0005591762 python3.9[104695]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:39:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:18.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:18 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128007be0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:18.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:18 np0005591762 python3.9[104852]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:19 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:19 np0005591762 python3.9[105028]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:19 np0005591762 python3.9[105106]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.7teu3f1r recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:19 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:39:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:20.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:39:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:20 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:20 np0005591762 python3.9[105258]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:20.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:20 np0005591762 python3.9[105337]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.65lut2w4 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:21 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:21 np0005591762 python3.9[105490]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:39:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:21 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:21 np0005591762 python3.9[105642]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000008s ======
Jan 22 04:39:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:22.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000008s
Jan 22 04:39:22 np0005591762 python3.9[105720]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:39:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:22 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:22.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:23 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128007be0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:23 np0005591762 python3.9[105873]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:23 np0005591762 python3.9[105952]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:39:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:23 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:24.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:24 np0005591762 python3.9[106104]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:24 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:24.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:24 np0005591762 python3.9[106257]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:25 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:25 np0005591762 python3.9[106335]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:25 np0005591762 python3.9[106488]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:25 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128007be0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:26.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:26 np0005591762 python3.9[106566]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:26 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:26.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:26 np0005591762 python3.9[106719]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:39:26 np0005591762 systemd[1]: Reloading.
Jan 22 04:39:27 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:39:27 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:39:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:27 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:27 np0005591762 systemd[1]: session-18.scope: Deactivated successfully.
Jan 22 04:39:27 np0005591762 systemd[1]: session-18.scope: Consumed 6.170s CPU time.
Jan 22 04:39:27 np0005591762 systemd-logind[744]: Session 18 logged out. Waiting for processes to exit.
Jan 22 04:39:27 np0005591762 systemd-logind[744]: Removed session 18.
Jan 22 04:39:27 np0005591762 python3.9[106910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:27 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:28.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:28 np0005591762 python3.9[106988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:28 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5128007be0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:28 np0005591762 python3.9[107141]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:28.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:28 np0005591762 python3.9[107220]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:29 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:29 np0005591762 python3.9[107374]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:39:29 np0005591762 systemd[1]: Reloading.
Jan 22 04:39:29 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:39:29 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:39:29 np0005591762 systemd[1]: Starting Create netns directory...
Jan 22 04:39:29 np0005591762 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 04:39:29 np0005591762 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 04:39:29 np0005591762 systemd[1]: Finished Create netns directory.
Jan 22 04:39:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:29 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:30.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:30 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:30 np0005591762 python3.9[107591]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:39:30 np0005591762 network[107609]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:39:30 np0005591762 network[107610]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:39:30 np0005591762 network[107611]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:39:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:30.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:31 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f512c0b0880 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:31 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:32.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:32 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:39:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:32.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:39:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:33 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:33 np0005591762 python3.9[107876]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:33 np0005591762 python3.9[107954]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:33 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:34 np0005591762 python3.9[108106]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:34 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:34 np0005591762 python3.9[108259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:34.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:35 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:35 np0005591762 python3.9[108337]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:35 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:36.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:36 np0005591762 python3.9[108490]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 04:39:36 np0005591762 systemd[1]: Starting Time & Date Service...
Jan 22 04:39:36 np0005591762 systemd[1]: Started Time & Date Service.
Jan 22 04:39:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:36 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:36.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:36 np0005591762 python3.9[108647]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:37 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:37 np0005591762 python3.9[108800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:37 np0005591762 python3.9[108878]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:37 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:38 np0005591762 python3.9[109030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:38 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:38 np0005591762 python3.9[109108]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.4_k3obpr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:38.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:38 np0005591762 python3.9[109261]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:39 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:39 np0005591762 python3.9[109340]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:39 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5138029400 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:40.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:40 np0005591762 python3.9[109492]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:39:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:40 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:40.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:40 np0005591762 python3[109646]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 04:39:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:41 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:41 np0005591762 python3.9[109799]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:41 np0005591762 python3.9[109877]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:41 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:42.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:42 np0005591762 python3.9[110029]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:42 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5138029400 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:42 np0005591762 python3.9[110155]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074781.9625804-897-186458798718676/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:42.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:43 np0005591762 python3.9[110308]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:43 np0005591762 python3.9[110386]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:43 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:44.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:44 np0005591762 python3.9[110538]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:44 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:44 np0005591762 python3.9[110616]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:44.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:45 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5138029400 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:45 np0005591762 python3.9[110769]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:45 np0005591762 python3.9[110848]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:45 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:46.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:46 np0005591762 python3.9[111000]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:39:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:46 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:46 np0005591762 python3.9[111156]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:39:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:46.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:39:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:47 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:47 np0005591762 python3.9[111309]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:47 np0005591762 python3.9[111461]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:47 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5138029400 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:48.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:48 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:48 np0005591762 python3.9[111613]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 04:39:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:48.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:49 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:49 np0005591762 python3.9[111766]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 04:39:49 np0005591762 systemd[1]: session-42.scope: Deactivated successfully.
Jan 22 04:39:49 np0005591762 systemd[1]: session-42.scope: Consumed 20.246s CPU time.
Jan 22 04:39:49 np0005591762 systemd-logind[744]: Session 42 logged out. Waiting for processes to exit.
Jan 22 04:39:49 np0005591762 systemd-logind[744]: Removed session 42.
Jan 22 04:39:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:49 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5120004890 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:50.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:50 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5138029400 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:50.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:51 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f50f8004de0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:51 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:52.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:52 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:39:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/093952 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:39:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:52.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:53 np0005591762 kernel: ganesha.nfsd[103031]: segfault at 50 ip 00007f5187b6532e sp 00007f510cff8210 error 4 in libntirpc.so.5.8[7f5187b4a000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 22 04:39:53 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:39:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[87135]: 22/01/2026 09:39:53 : epoch 6971ef80 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f51040089d0 fd 49 proxy ignored for local
Jan 22 04:39:53 np0005591762 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 22 04:39:53 np0005591762 systemd[1]: Started Process Core Dump (PID 111821/UID 0).
Jan 22 04:39:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:54.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:54 np0005591762 systemd-coredump[111823]: Process 87139 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 67:#012#0  0x00007f5187b6532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f5187b6f900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 22 04:39:54 np0005591762 systemd[1]: systemd-coredump@0-111821-0.service: Deactivated successfully.
Jan 22 04:39:54 np0005591762 systemd[1]: systemd-coredump@0-111821-0.service: Consumed 1.021s CPU time.
Jan 22 04:39:54 np0005591762 podman[111828]: 2026-01-22 09:39:54.17847139 +0000 UTC m=+0.018016054 container died 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:39:54 np0005591762 systemd[1]: var-lib-containers-storage-overlay-775b89649c08bc74313081ba8794beae129ab390be1891918b3bb31aca319fa0-merged.mount: Deactivated successfully.
Jan 22 04:39:54 np0005591762 systemd[82971]: Created slice User Background Tasks Slice.
Jan 22 04:39:54 np0005591762 systemd[82971]: Starting Cleanup of User's Temporary Files and Directories...
Jan 22 04:39:54 np0005591762 podman[111828]: 2026-01-22 09:39:54.19866552 +0000 UTC m=+0.038210163 container remove 46dd2f89ced7c6478c77c51cb943d25367f5e75c81257e0d7816a9c9218a28ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 22 04:39:54 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:39:54 np0005591762 systemd[82971]: Finished Cleanup of User's Temporary Files and Directories.
Jan 22 04:39:54 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:39:54 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.203s CPU time.
Jan 22 04:39:54 np0005591762 systemd-logind[744]: New session 43 of user zuul.
Jan 22 04:39:54 np0005591762 systemd[1]: Started Session 43 of User zuul.
Jan 22 04:39:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:54.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:55 np0005591762 python3.9[112018]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 22 04:39:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:55 np0005591762 python3.9[112171]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:39:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:56.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:56 np0005591762 python3.9[112325]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 22 04:39:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:56 np0005591762 python3.9[112478]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.lwyqerou follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:39:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:39:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:56.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:57 np0005591762 python3.9[112604]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.lwyqerou mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074796.3996825-104-182962677618717/.source.lwyqerou _original_basename=.4ff_pyg8 follow=False checksum=29616629aac123748dd219790bea456c41d2072c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:39:58.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:58 np0005591762 python3.9[112756]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:39:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:58 np0005591762 python3.9[112909]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1l/4Hnab8cJ+0NgZRyND+668QQ18xCAMiTa4tJfwkacqv2+xu0AP833wzvRbj+BSz/GJYAjYZHtl/LPY/fgAiwZLhNui+6RFQXnMI+TWlUgadcYlxCFSLNXdeIU4VHKdxnYN8cw8WtM+PFaCdmFRk0NGTRLladuZ2Ft6qgEk/ocZCZ1hweLpc0NBPMupsV5ABFtNEZPBg5lEqxBdbFOY3MxlYJEKWIsWCyxu9jzoxc8ct4ejcM8FVx9pujC2XCWVumSYrXkp9LnbeYCOlxnalYYTgZWNh3ilMYw3g85DVUyF1ZECfbN4/uuu9emfUiC8EmIRofJTX7/IPDpqM0CgSFHt6gq45OgfrZ+YHcpPg8Bq5JWL3rpkIoZDiidmCCGrtku8huN9VGYcahOdJVixsNrfIS2jx9k86e19gNzUSKc3qxM6HCUrH0yEbXwcOcG6b1EcBllpJsHB3uXZNar6PeI2C+BkUQH/0520RqM7Zb0ZEg4+6S6i+Z11Ddhkn+Sk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEZOEP9uQiV1zH3a3aHqfWGEuJqzUo4rClu3BLMlWitr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNySjQgocwwOdUR7+1+vff+WJ7HHi2x7SZejx49o87M82KSvvvJ1bXTTeQ2yV4jf9DSKuJ6HcIHDr6bnAXEDEj8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDx6FoZ1mQHUkExUKBX3RXUJtaZVmdK+/kJ75+oWOFtIZlx0mZcdVNn/rW0Q++oQhtNRWXFfZrC6xkhCT1INz4AehTVQ2y9DTa6PxylfZKv4SS0yNLP/UkFFMiKtWgxzfnFYniRmVr6pgKNAsIxOlGQHtYY9MzvNCU0rfxVJQV1DM7am+c3mbsqlU0w7R+Tur5zDSLFdysQdDqAk4UqlqkgYagUBOhC/cnkuUNOyj3idOKJhFrz/mnkO3P/KrXcgMPfFtu+yx5rQNDNyoZV1bp+uPgP8kvQGe5ol/cbTEiXlZ5BEgYcKbky8H1ICbcoiG5YcmEMNOm8s88fxvf6dJpdeAmjmraoHZtKson2jeZ7NsYgsjNhwKEElcxzAfhnhK+IfalpZhHQxGypR/IPlQrLlJOrbyAEIyk40nASUHxlJrOXP1lA9dvLaG/3KkIa2sPwaIgdVhzpmyodJds2sMg6cngRljDGY1UBTYGyo8vNNILFoCzMPNDcNCyY9xWYz8M=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICDQuz7VE0tTRnQJ96QrHIwmJh8osJY9A2+gmzkUlh54#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM7hnQz957+RtY0Mltzkw+lJRI4x2IlQwAuVKb+t24lorNdYqOmeiT8j8X9huVxPKGZSUxesKQ7YFrI9bxqNRo4=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLkHp1/Qvor0RkXO+PvZvnJssDpVN93zM11quNN8iQ4KKQf8UHuKy+z84HXpOkzuxv1FNmR50SFPdR2h52T9/BEP+zzSmYli9cDaisI9zLQpghAnG+lXYjqsiPIXqR2z4IheTXQWRoc0c/9XzYCUMaMD73LVsv2ZTHG2Y7QfvK4MxYDPfGzTPihT0BaumTQQi1aKi5eILvXezyBhIgOrgWXDy73LvUS0A1PnwBTWjez2dmfEl2SozhpeqVRSmWdCZ8dRtXREfB6Mq/AC0SFrdQRYBB1fp6IKFrJhehXq8uN9YGQim7NDv95g1Vbg09hBzVMVRBut+meLFMgQicOFxX4cOH/zmBq2HZZ4NgoXQIttG2MWvRDeeOArcoiR4trg88CvXIKbHm7X3Xz124i1la6Znzd233vMLjW61sfm2BSiRvi2U199hCeHLpCKZDeXEfNKKws4/PCyJpilTrDhy01w/oqI6uKjCvuEpfNoDSqx4gfjAyjJboFWEV2ArMddk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAzNyDe1tBrOdz2+WL/pj9pc2M51PHCPiPpvoZYn4bHE#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBELYxft8jWfz1ywTUaPBtZwChEDFG53eKlkYcIDxgJP7KVnKVHGrkh7LMAVvlpn5gDq4gHPOx2/pvsvKR+u3AfU=#012 create=True mode=0644 path=/tmp/ansible.lwyqerou state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:39:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:39:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:39:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:39:58.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:39:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/093959 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:39:59 np0005591762 python3.9[113062]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.lwyqerou' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:39:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:39:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:39:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:39:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:00.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:00 np0005591762 python3.9[113216]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.lwyqerou state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:00 np0005591762 ceph-mon[75519]: overall HEALTH_OK
Jan 22 04:40:00 np0005591762 systemd[1]: session-43.scope: Deactivated successfully.
Jan 22 04:40:00 np0005591762 systemd[1]: session-43.scope: Consumed 3.481s CPU time.
Jan 22 04:40:00 np0005591762 systemd-logind[744]: Session 43 logged out. Waiting for processes to exit.
Jan 22 04:40:00 np0005591762 systemd-logind[744]: Removed session 43.
Jan 22 04:40:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:00.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:02.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:02.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:04.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:04 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 1.
Jan 22 04:40:04 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:40:04 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.203s CPU time.
Jan 22 04:40:04 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:04 np0005591762 podman[113361]: 2026-01-22 09:40:04.660933481 +0000 UTC m=+0.029244972 container create 7a8b9569f5cd8b941e006967d43735778681f96a3273ee6fe808029e0657f6ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 22 04:40:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6d866fbb99b7b7cc3a52d78579a461c5147b447cdc35c8b1b10baad6f0f5e40/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6d866fbb99b7b7cc3a52d78579a461c5147b447cdc35c8b1b10baad6f0f5e40/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6d866fbb99b7b7cc3a52d78579a461c5147b447cdc35c8b1b10baad6f0f5e40/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6d866fbb99b7b7cc3a52d78579a461c5147b447cdc35c8b1b10baad6f0f5e40/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:04 np0005591762 podman[113361]: 2026-01-22 09:40:04.704627379 +0000 UTC m=+0.072938880 container init 7a8b9569f5cd8b941e006967d43735778681f96a3273ee6fe808029e0657f6ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True)
Jan 22 04:40:04 np0005591762 podman[113361]: 2026-01-22 09:40:04.70960377 +0000 UTC m=+0.077915261 container start 7a8b9569f5cd8b941e006967d43735778681f96a3273ee6fe808029e0657f6ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:40:04 np0005591762 bash[113361]: 7a8b9569f5cd8b941e006967d43735778681f96a3273ee6fe808029e0657f6ff
Jan 22 04:40:04 np0005591762 podman[113361]: 2026-01-22 09:40:04.649526747 +0000 UTC m=+0.017838239 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:40:04 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:40:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:04 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:40:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:04.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:40:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:40:05 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:40:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:40:05 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:40:05 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:40:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:05 np0005591762 systemd-logind[744]: New session 44 of user zuul.
Jan 22 04:40:05 np0005591762 systemd[1]: Started Session 44 of User zuul.
Jan 22 04:40:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:06.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:06 np0005591762 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 04:40:06 np0005591762 python3.9[113569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:40:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:06.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:07 np0005591762 python3.9[113729]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 04:40:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:08.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:08 np0005591762 python3.9[113883]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:40:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:08 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:40:08 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:40:08 np0005591762 python3.9[114062]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:40:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:08.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:09 np0005591762 python3.9[114216]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:40:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:10.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:10 np0005591762 python3.9[114393]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:10 np0005591762 systemd[1]: session-44.scope: Deactivated successfully.
Jan 22 04:40:10 np0005591762 systemd[1]: session-44.scope: Consumed 2.664s CPU time.
Jan 22 04:40:10 np0005591762 systemd-logind[744]: Session 44 logged out. Waiting for processes to exit.
Jan 22 04:40:10 np0005591762 systemd-logind[744]: Removed session 44.
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:40:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:10 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:40:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:10.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:12.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:12.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:14.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094014 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:40:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:14.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:15 np0005591762 systemd-logind[744]: New session 45 of user zuul.
Jan 22 04:40:15 np0005591762 systemd[1]: Started Session 45 of User zuul.
Jan 22 04:40:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:16.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:16 np0005591762 python3.9[114577]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000007:nfs.cephfs.1: -2
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:40:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:16 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:40:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:16.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:17 np0005591762 kernel: ganesha.nfsd[114664]: segfault at 50 ip 00007fc66c6de32e sp 00007fc5e9ffa210 error 4 in libntirpc.so.5.8[7fc66c6c3000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 22 04:40:17 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:40:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[113373]: 22/01/2026 09:40:17 : epoch 6971f074 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5e4000df0 fd 38 proxy ignored for local
Jan 22 04:40:17 np0005591762 systemd[1]: Started Process Core Dump (PID 114751/UID 0).
Jan 22 04:40:17 np0005591762 python3.9[114747]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:40:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:17 np0005591762 python3.9[114836]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 04:40:18 np0005591762 systemd-coredump[114752]: Process 113377 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007fc66c6de32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:40:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:18.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:18 np0005591762 systemd[1]: systemd-coredump@1-114751-0.service: Deactivated successfully.
Jan 22 04:40:18 np0005591762 systemd[1]: systemd-coredump@1-114751-0.service: Consumed 1.016s CPU time.
Jan 22 04:40:18 np0005591762 podman[114842]: 2026-01-22 09:40:18.184404444 +0000 UTC m=+0.018713217 container died 7a8b9569f5cd8b941e006967d43735778681f96a3273ee6fe808029e0657f6ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 04:40:18 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c6d866fbb99b7b7cc3a52d78579a461c5147b447cdc35c8b1b10baad6f0f5e40-merged.mount: Deactivated successfully.
Jan 22 04:40:18 np0005591762 podman[114842]: 2026-01-22 09:40:18.203189355 +0000 UTC m=+0.037498129 container remove 7a8b9569f5cd8b941e006967d43735778681f96a3273ee6fe808029e0657f6ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:40:18 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:40:18 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:40:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:18.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:19 np0005591762 python3.9[115026]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:40:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:20.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:20 np0005591762 python3.9[115177]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 04:40:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:20 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:40:20 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:40:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:20.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:21 np0005591762 python3.9[115329]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:40:21 np0005591762 python3.9[115480]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:40:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:21 np0005591762 systemd[1]: session-45.scope: Deactivated successfully.
Jan 22 04:40:21 np0005591762 systemd[1]: session-45.scope: Consumed 4.154s CPU time.
Jan 22 04:40:21 np0005591762 systemd-logind[744]: Session 45 logged out. Waiting for processes to exit.
Jan 22 04:40:21 np0005591762 systemd-logind[744]: Removed session 45.
Jan 22 04:40:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000009s ======
Jan 22 04:40:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:22.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Jan 22 04:40:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:22.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:24.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:24.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:26.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:26 np0005591762 systemd-logind[744]: New session 46 of user zuul.
Jan 22 04:40:26 np0005591762 systemd[1]: Started Session 46 of User zuul.
Jan 22 04:40:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:26.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:27 np0005591762 python3.9[115664]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:40:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:28.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:28 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 2.
Jan 22 04:40:28 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:40:28 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:28 np0005591762 podman[115860]: 2026-01-22 09:40:28.700330919 +0000 UTC m=+0.028195724 container create 525254b133567fe244e786ebb7725410a5a0a2f17b6a815296f95a19da17f1be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 22 04:40:28 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3011b9da04032974b963db6923ee2c9c262c6ccbbc5028c60160188f29cd5e/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:28 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3011b9da04032974b963db6923ee2c9c262c6ccbbc5028c60160188f29cd5e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:28 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3011b9da04032974b963db6923ee2c9c262c6ccbbc5028c60160188f29cd5e/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:28 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3011b9da04032974b963db6923ee2c9c262c6ccbbc5028c60160188f29cd5e/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:28 np0005591762 podman[115860]: 2026-01-22 09:40:28.743933165 +0000 UTC m=+0.071797990 container init 525254b133567fe244e786ebb7725410a5a0a2f17b6a815296f95a19da17f1be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 22 04:40:28 np0005591762 podman[115860]: 2026-01-22 09:40:28.748881453 +0000 UTC m=+0.076746258 container start 525254b133567fe244e786ebb7725410a5a0a2f17b6a815296f95a19da17f1be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:40:28 np0005591762 bash[115860]: 525254b133567fe244e786ebb7725410a5a0a2f17b6a815296f95a19da17f1be
Jan 22 04:40:28 np0005591762 podman[115860]: 2026-01-22 09:40:28.688005804 +0000 UTC m=+0.015870629 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:40:28 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:40:28 np0005591762 python3.9[115839]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:40:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:28 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:40:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:28.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:29 np0005591762 python3.9[116066]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:30 np0005591762 python3.9[116218]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:30.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:30 np0005591762 python3.9[116366]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074829.6643362-156-41228841773829/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=c6832de8ea6559e5d6b796d053edff3c5bbd4ac9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:30.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:30 np0005591762 python3.9[116519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:31 np0005591762 python3.9[116643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074830.56177-156-88123669027823/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=b941eed241a5b99f9369a04b2b65d73a34d75e07 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:31 np0005591762 python3.9[116795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:32.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:32 np0005591762 python3.9[116918]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074831.4595408-156-217102969248267/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=8af57ca22791a51a291afbc9639d27017922a505 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:32 np0005591762 python3.9[117071]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000019s ======
Jan 22 04:40:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:32.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000019s
Jan 22 04:40:33 np0005591762 python3.9[117224]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:33 np0005591762 python3.9[117376]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:34.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:34 np0005591762 python3.9[117499]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074833.5253618-343-17042646093114/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=c69fde73c28581c844f14a72990009ae2e9dd6ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:34 np0005591762 python3.9[117652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:34 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:40:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:34 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:40:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:34.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:35 np0005591762 python3.9[117775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074834.3852086-343-95563258315103/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=3dfb7d65ba619ae3fdfdd05ac78d95a034b5ef3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:35 np0005591762 python3.9[117928]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:35 np0005591762 python3.9[118051]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074835.1801083-343-246919171787699/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=bb40eabf16cc3ddc76518c34c1f7017d7b088362 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:36.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:36 np0005591762 python3.9[118203]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:36 np0005591762 python3.9[118356]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:36.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:37 np0005591762 python3.9[118509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:37 np0005591762 python3.9[118632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074837.017023-514-104489023154860/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=c514ab4990d128bf1e2deef751e27452df228b29 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:38 np0005591762 python3.9[118784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:38 np0005591762 python3.9[118907]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074837.8774397-514-165192767451701/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=3dfb7d65ba619ae3fdfdd05ac78d95a034b5ef3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:38.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:39 np0005591762 python3.9[119060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:39 np0005591762 python3.9[119184]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074838.7857654-514-13555815223453/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=5d5de2156825f3d44c7f96bbb204aed7172fe315 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094040 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:40:40 np0005591762 python3.9[119336]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:40:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:40 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:40:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:41 np0005591762 python3.9[119498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:41 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6484000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:40:41 np0005591762 python3.9[119628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074840.6742194-722-99622101625285/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:41 np0005591762 python3.9[119780]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:41 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x55fac656dc30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:40:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:42.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:42 np0005591762 python3.9[119932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:42 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6480001e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:40:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:42 np0005591762 python3.9[120056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074842.0436108-793-260699307653332/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:42.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[115872]: 22/01/2026 09:40:43 : epoch 6971f08c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6480001e90 fd 39 proxy ignored for local
Jan 22 04:40:43 np0005591762 kernel: ganesha.nfsd[119496]: segfault at 50 ip 00007f650c6bf32e sp 00007f648cff8210 error 4 in libntirpc.so.5.8[7f650c6a4000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 22 04:40:43 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:40:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094043 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:40:43 np0005591762 systemd[1]: Started Process Core Dump (PID 120181/UID 0).
Jan 22 04:40:43 np0005591762 python3.9[120211]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:43 np0005591762 python3.9[120365]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:44 np0005591762 systemd-coredump[120182]: Process 115876 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 47:#012#0  0x00007f650c6bf32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:40:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:44.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:44 np0005591762 systemd[1]: systemd-coredump@2-120181-0.service: Deactivated successfully.
Jan 22 04:40:44 np0005591762 podman[120493]: 2026-01-22 09:40:44.171124451 +0000 UTC m=+0.020095123 container died 525254b133567fe244e786ebb7725410a5a0a2f17b6a815296f95a19da17f1be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:40:44 np0005591762 systemd[1]: var-lib-containers-storage-overlay-ca3011b9da04032974b963db6923ee2c9c262c6ccbbc5028c60160188f29cd5e-merged.mount: Deactivated successfully.
Jan 22 04:40:44 np0005591762 python3.9[120488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074843.4389043-865-164503401660943/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:44 np0005591762 podman[120493]: 2026-01-22 09:40:44.189947404 +0000 UTC m=+0.038918078 container remove 525254b133567fe244e786ebb7725410a5a0a2f17b6a815296f95a19da17f1be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True)
Jan 22 04:40:44 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:40:44 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:40:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:44 np0005591762 python3.9[120676]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:44.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:45 np0005591762 python3.9[120828]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:45 np0005591762 python3.9[120952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074844.8112562-937-126066206910947/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:46 np0005591762 python3.9[121104]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:46.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:46 np0005591762 python3.9[121256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:46 np0005591762 python3.9[121380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074846.1792223-1010-95682852500106/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:46.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:47 np0005591762 python3.9[121533]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:40:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:47 np0005591762 python3.9[121685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:48.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:48 np0005591762 python3.9[121808]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074847.50635-1081-167100757100009/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1d5973fd0d9f852bbc11b3ee817a5e73d7de1dd3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:48 np0005591762 systemd[1]: session-46.scope: Deactivated successfully.
Jan 22 04:40:48 np0005591762 systemd[1]: session-46.scope: Consumed 16.784s CPU time.
Jan 22 04:40:48 np0005591762 systemd-logind[744]: Session 46 logged out. Waiting for processes to exit.
Jan 22 04:40:48 np0005591762 systemd-logind[744]: Removed session 46.
Jan 22 04:40:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:48.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094049 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:40:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:50.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:50.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:52.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:52.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:53 np0005591762 systemd-logind[744]: New session 47 of user zuul.
Jan 22 04:40:54 np0005591762 systemd[1]: Started Session 47 of User zuul.
Jan 22 04:40:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:54.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:54 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 3.
Jan 22 04:40:54 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:40:54 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:40:54 np0005591762 podman[122061]: 2026-01-22 09:40:54.550096099 +0000 UTC m=+0.028297525 container create f42057facbad0f1f822e1e64862d8c3bc148484d7aa9cb508c2b6465c0b6b79f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7f41a712c8cf629cc0b2484acb799a9148183358a9b0e07d453e6b95b86543a/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7f41a712c8cf629cc0b2484acb799a9148183358a9b0e07d453e6b95b86543a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7f41a712c8cf629cc0b2484acb799a9148183358a9b0e07d453e6b95b86543a/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7f41a712c8cf629cc0b2484acb799a9148183358a9b0e07d453e6b95b86543a/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:40:54 np0005591762 podman[122061]: 2026-01-22 09:40:54.598774193 +0000 UTC m=+0.076975640 container init f42057facbad0f1f822e1e64862d8c3bc148484d7aa9cb508c2b6465c0b6b79f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 22 04:40:54 np0005591762 python3.9[122031]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:54 np0005591762 podman[122061]: 2026-01-22 09:40:54.60301116 +0000 UTC m=+0.081212577 container start f42057facbad0f1f822e1e64862d8c3bc148484d7aa9cb508c2b6465c0b6b79f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 04:40:54 np0005591762 bash[122061]: f42057facbad0f1f822e1e64862d8c3bc148484d7aa9cb508c2b6465c0b6b79f
Jan 22 04:40:54 np0005591762 podman[122061]: 2026-01-22 09:40:54.538911343 +0000 UTC m=+0.017112790 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:40:54 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:40:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:40:54 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:40:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:40:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:54.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:40:55 np0005591762 python3.9[122267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:55 np0005591762 python3.9[122390]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074854.7660124-59-125667225254963/.source.conf _original_basename=ceph.conf follow=False checksum=03d8d4124bbce310504894436c4a9612ab8c13f5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:56.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:56 np0005591762 python3.9[122542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:40:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:56 np0005591762 python3.9[122665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074855.8654883-59-271202773473545/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=6b7917605681093964532d08a385bc3f0474a26c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:40:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:40:56 np0005591762 systemd[1]: session-47.scope: Deactivated successfully.
Jan 22 04:40:56 np0005591762 systemd[1]: session-47.scope: Consumed 1.861s CPU time.
Jan 22 04:40:56 np0005591762 systemd-logind[744]: Session 47 logged out. Waiting for processes to exit.
Jan 22 04:40:56 np0005591762 systemd-logind[744]: Removed session 47.
Jan 22 04:40:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:56.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:40:58.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:40:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:40:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:40:58.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:40:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:40:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:40:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:40:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:41:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:00.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:41:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:00 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:41:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:02 np0005591762 systemd-logind[744]: New session 48 of user zuul.
Jan 22 04:41:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:02.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:02 np0005591762 systemd[1]: Started Session 48 of User zuul.
Jan 22 04:41:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:02.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:02 np0005591762 python3.9[122850]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:41:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:03 np0005591762 python3.9[123007]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:04.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:04 np0005591762 python3.9[123159]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094104 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:41:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:04 np0005591762 python3.9[123310]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:41:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:04.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:05 np0005591762 python3.9[123463]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 04:41:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:06.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000a:nfs.cephfs.1: -2
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:41:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:06 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:41:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:06.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:07 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc4000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:07 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 22 04:41:07 np0005591762 python3.9[123636]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:41:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:07 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:08.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:08 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:08 np0005591762 python3.9[123785]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:41:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:41:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:08.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:41:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094109 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:41:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:09 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002020 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:09 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:41:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:41:09 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:41:09 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:41:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:09 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0002880 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:10.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:10 np0005591762 python3.9[123979]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:41:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:10 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0002880 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:10.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:11 np0005591762 python3[124135]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 22 04:41:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:11 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:11 np0005591762 python3.9[124288]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:11 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:12.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:12 np0005591762 python3.9[124465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:12 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0003590 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094112 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:41:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:41:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:41:12 np0005591762 python3.9[124544]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:12.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:13 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0003590 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:13 np0005591762 python3.9[124697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:13 np0005591762 python3.9[124775]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.nqk9d89d recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:13 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:14 np0005591762 python3.9[124927]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:14.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:14 np0005591762 python3.9[125005]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:14 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:14.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:14 np0005591762 python3.9[125158]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:15 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0004030 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:15 np0005591762 python3[125312]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 04:41:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:15 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0004030 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:16.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:16 np0005591762 python3.9[125464]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:16 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80030f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:16 np0005591762 python3.9[125589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074875.8120277-429-230626048332696/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:16.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:17 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:17 np0005591762 python3.9[125742]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:17 np0005591762 python3.9[125868]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074876.8214703-473-238129701274310/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:17 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0004030 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:18.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:18 np0005591762 python3.9[126020]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:18 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0004030 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:18 np0005591762 python3.9[126145]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074877.7781246-519-272383211778963/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:18.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:19 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80030f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:19 np0005591762 python3.9[126298]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:19 np0005591762 python3.9[126424]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074878.7113078-564-179794575786852/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:19 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:20.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:20 np0005591762 python3.9[126576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:20 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0005520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:20 np0005591762 python3.9[126701]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769074879.756059-609-46666368115990/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:20 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:41:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:20.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:21 np0005591762 python3.9[126854]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:21 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0005520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:21 np0005591762 python3.9[127007]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:21 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:22 np0005591762 python3.9[127162]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:22.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:22 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:22 np0005591762 python3.9[127315]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:22.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:23 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0005520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:23 np0005591762 python3.9[127469]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:23 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:23 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:23 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:41:23 np0005591762 python3.9[127623]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:23 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0005520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:24.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:24 np0005591762 python3.9[127778]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:24 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:24.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:25 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:25 np0005591762 python3.9[127929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:41:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:25 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:26.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:26 np0005591762 python3.9[128083]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:26 np0005591762 ovs-vsctl[128084]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 22 04:41:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:26 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:26 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:41:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:26 np0005591762 python3.9[128237]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:26.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:27 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:27 np0005591762 python3.9[128393]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:27 np0005591762 ovs-vsctl[128394]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 22 04:41:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:27 np0005591762 python3.9[128544]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:41:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:27 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40050b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:28.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:28 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:28 np0005591762 python3.9[128698]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:41:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:28.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:41:29 np0005591762 python3.9[128851]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:29 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:29 np0005591762 python3.9[128930]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:29 np0005591762 python3.9[129082]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:29 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8004b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:30.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:30 np0005591762 python3.9[129160]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:30 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40050b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:30 np0005591762 python3.9[129338]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:30.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:31 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:31 np0005591762 python3.9[129490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:31 np0005591762 python3.9[129569]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:31 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:32 np0005591762 python3.9[129721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:32.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:32 np0005591762 python3.9[129799]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:32 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8004b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094132 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:41:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:32.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:33 np0005591762 python3.9[129952]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:41:33 np0005591762 systemd[1]: Reloading.
Jan 22 04:41:33 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:41:33 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:41:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:33 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40050b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:33 np0005591762 python3.9[130143]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:33 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40050b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.018256) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074894018286, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2597, "num_deletes": 251, "total_data_size": 6404287, "memory_usage": 6493704, "flush_reason": "Manual Compaction"}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074894026501, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4149939, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10650, "largest_seqno": 13242, "table_properties": {"data_size": 4139173, "index_size": 6812, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 24357, "raw_average_key_size": 21, "raw_value_size": 4116738, "raw_average_value_size": 3579, "num_data_blocks": 297, "num_entries": 1150, "num_filter_entries": 1150, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074702, "oldest_key_time": 1769074702, "file_creation_time": 1769074894, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8314 microseconds, and 6010 cpu microseconds.
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.026569) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4149939 bytes OK
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.026606) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.027199) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.027210) EVENT_LOG_v1 {"time_micros": 1769074894027206, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.027219) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6392469, prev total WAL file size 6392469, number of live WAL files 2.
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.028255) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4052KB)], [21(13MB)]
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074894028319, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 18279735, "oldest_snapshot_seqno": -1}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4543 keys, 15426130 bytes, temperature: kUnknown
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074894061966, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15426130, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15388784, "index_size": 24884, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 114651, "raw_average_key_size": 25, "raw_value_size": 15299010, "raw_average_value_size": 3367, "num_data_blocks": 1064, "num_entries": 4543, "num_filter_entries": 4543, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769074894, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.062210) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15426130 bytes
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.062657) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 541.1 rd, 456.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 13.5 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(8.1) write-amplify(3.7) OK, records in: 5063, records dropped: 520 output_compression: NoCompression
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.062680) EVENT_LOG_v1 {"time_micros": 1769074894062664, "job": 10, "event": "compaction_finished", "compaction_time_micros": 33785, "compaction_time_cpu_micros": 21840, "output_level": 6, "num_output_files": 1, "total_output_size": 15426130, "num_input_records": 5063, "num_output_records": 4543, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074894063475, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 22 04:41:34 np0005591762 python3.9[130221]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074894065249, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.028193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.065357) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.065361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.065363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.065364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:34 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:34.065365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:34.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:34 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:34 np0005591762 python3.9[130373]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:34 np0005591762 python3.9[130452]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:34.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:35 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8004b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:35 np0005591762 python3.9[130605]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:41:35 np0005591762 systemd[1]: Reloading.
Jan 22 04:41:35 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:41:35 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:41:35 np0005591762 systemd[1]: Starting Create netns directory...
Jan 22 04:41:35 np0005591762 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 04:41:35 np0005591762 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 04:41:35 np0005591762 systemd[1]: Finished Create netns directory.
Jan 22 04:41:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:35 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40050b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:36.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:36 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:41:36 np0005591762 python3.9[130798]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:36.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:37 np0005591762 python3.9[130951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:37 np0005591762 kernel: ganesha.nfsd[123499]: segfault at 50 ip 00007f5c4503d32e sp 00007f5bbdffa210 error 4 in libntirpc.so.5.8[7f5c45022000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 22 04:41:37 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:41:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[122073]: 22/01/2026 09:41:37 : epoch 6971f0a6 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb0006a10 fd 37 proxy ignored for local
Jan 22 04:41:37 np0005591762 systemd[1]: Started Process Core Dump (PID 130961/UID 0).
Jan 22 04:41:37 np0005591762 python3.9[131077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074896.7334294-1362-38720358412556/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:38 np0005591762 systemd-coredump[130977]: Process 122077 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 46:#012#0  0x00007f5c4503d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:41:38 np0005591762 systemd[1]: systemd-coredump@3-130961-0.service: Deactivated successfully.
Jan 22 04:41:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:38.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:38 np0005591762 podman[131234]: 2026-01-22 09:41:38.182858763 +0000 UTC m=+0.023725819 container died f42057facbad0f1f822e1e64862d8c3bc148484d7aa9cb508c2b6465c0b6b79f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 22 04:41:38 np0005591762 systemd[1]: var-lib-containers-storage-overlay-b7f41a712c8cf629cc0b2484acb799a9148183358a9b0e07d453e6b95b86543a-merged.mount: Deactivated successfully.
Jan 22 04:41:38 np0005591762 podman[131234]: 2026-01-22 09:41:38.201367113 +0000 UTC m=+0.042234159 container remove f42057facbad0f1f822e1e64862d8c3bc148484d7aa9cb508c2b6465c0b6b79f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:41:38 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:41:38 np0005591762 python3.9[131230]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:38 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:41:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:38 np0005591762 python3.9[131420]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:41:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:39 np0005591762 python3.9[131573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:39 np0005591762 python3.9[131696]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074899.1296494-1460-103959369037299/.source.json _original_basename=.wxnmak3z follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:40.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:40 np0005591762 python3.9[131846]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:40.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:42 np0005591762 python3.9[132271]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 22 04:41:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:42.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:42 np0005591762 python3.9[132424]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 04:41:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:42.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094143 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:41:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:43 np0005591762 python3[132577]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 04:41:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:44.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:44.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:46.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:46.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:48.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.241635) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074908241674, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 372, "num_deletes": 251, "total_data_size": 381504, "memory_usage": 388192, "flush_reason": "Manual Compaction"}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074908242861, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 235259, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13247, "largest_seqno": 13614, "table_properties": {"data_size": 233057, "index_size": 366, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5607, "raw_average_key_size": 19, "raw_value_size": 228769, "raw_average_value_size": 775, "num_data_blocks": 16, "num_entries": 295, "num_filter_entries": 295, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074895, "oldest_key_time": 1769074895, "file_creation_time": 1769074908, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 1249 microseconds, and 842 cpu microseconds.
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.242885) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 235259 bytes OK
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.242895) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.243213) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.243224) EVENT_LOG_v1 {"time_micros": 1769074908243221, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.243233) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 379033, prev total WAL file size 379033, number of live WAL files 2.
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.243545) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(229KB)], [24(14MB)]
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074908243643, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15661389, "oldest_snapshot_seqno": -1}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4329 keys, 12172483 bytes, temperature: kUnknown
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074908269153, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12172483, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12140404, "index_size": 20129, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 110735, "raw_average_key_size": 25, "raw_value_size": 12058134, "raw_average_value_size": 2785, "num_data_blocks": 858, "num_entries": 4329, "num_filter_entries": 4329, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769074908, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.269455) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12172483 bytes
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.275742) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 611.1 rd, 474.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 14.7 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(118.3) write-amplify(51.7) OK, records in: 4838, records dropped: 509 output_compression: NoCompression
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.275759) EVENT_LOG_v1 {"time_micros": 1769074908275752, "job": 12, "event": "compaction_finished", "compaction_time_micros": 25630, "compaction_time_cpu_micros": 19529, "output_level": 6, "num_output_files": 1, "total_output_size": 12172483, "num_input_records": 4838, "num_output_records": 4329, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074908275857, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769074908277650, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.243508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.277737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.277741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.277742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.277743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:48 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:41:48.277744) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:41:48 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 4.
Jan 22 04:41:48 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:41:48 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:41:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:48.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:49 np0005591762 podman[132588]: 2026-01-22 09:41:49.657763242 +0000 UTC m=+5.868890601 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 22 04:41:49 np0005591762 podman[132732]: 2026-01-22 09:41:49.73143992 +0000 UTC m=+0.034931336 container create afe0afb023a63d7e0b7f9bc5b6e24fe80b2cc6f5b44e1b52402d24e4a2736f4b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:41:49 np0005591762 podman[132750]: 2026-01-22 09:41:49.762572256 +0000 UTC m=+0.034406495 container create 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 04:41:49 np0005591762 podman[132750]: 2026-01-22 09:41:49.749118265 +0000 UTC m=+0.020952514 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 22 04:41:49 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f838b35c7a6b18cc84d254555158c8eb346ad6decd63a5d403eb66cae96acf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:41:49 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f838b35c7a6b18cc84d254555158c8eb346ad6decd63a5d403eb66cae96acf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:41:49 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f838b35c7a6b18cc84d254555158c8eb346ad6decd63a5d403eb66cae96acf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:41:49 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f838b35c7a6b18cc84d254555158c8eb346ad6decd63a5d403eb66cae96acf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:41:49 np0005591762 python3[132577]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 22 04:41:49 np0005591762 podman[132732]: 2026-01-22 09:41:49.783115356 +0000 UTC m=+0.086606792 container init afe0afb023a63d7e0b7f9bc5b6e24fe80b2cc6f5b44e1b52402d24e4a2736f4b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:41:49 np0005591762 podman[132732]: 2026-01-22 09:41:49.788285473 +0000 UTC m=+0.091776890 container start afe0afb023a63d7e0b7f9bc5b6e24fe80b2cc6f5b44e1b52402d24e4a2736f4b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 04:41:49 np0005591762 bash[132732]: afe0afb023a63d7e0b7f9bc5b6e24fe80b2cc6f5b44e1b52402d24e4a2736f4b
Jan 22 04:41:49 np0005591762 podman[132732]: 2026-01-22 09:41:49.71706414 +0000 UTC m=+0.020555575 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:41:49 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:41:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:41:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:50.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:50 np0005591762 python3.9[132998]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:41:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:50.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:50 np0005591762 python3.9[133153]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:51 np0005591762 python3.9[133230]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:41:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:52 np0005591762 python3.9[133381]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769074911.5439034-1694-9961580430537/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:52.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:52 np0005591762 python3.9[133457]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:41:52 np0005591762 systemd[1]: Reloading.
Jan 22 04:41:52 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:41:52 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:41:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:52.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:53 np0005591762 python3.9[133568]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:41:53 np0005591762 systemd[1]: Reloading.
Jan 22 04:41:53 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:41:53 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:41:53 np0005591762 systemd[1]: Starting ovn_controller container...
Jan 22 04:41:53 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:41:53 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6974f6f8625b4dccc971692e25c39d9d28dfa62345e555dca358cce60fdfa371/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 22 04:41:53 np0005591762 systemd[1]: Started /usr/bin/podman healthcheck run 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083.
Jan 22 04:41:53 np0005591762 podman[133610]: 2026-01-22 09:41:53.430081544 +0000 UTC m=+0.080639280 container init 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + sudo -E kolla_set_configs
Jan 22 04:41:53 np0005591762 podman[133610]: 2026-01-22 09:41:53.449775661 +0000 UTC m=+0.100333377 container start 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:41:53 np0005591762 edpm-start-podman-container[133610]: ovn_controller
Jan 22 04:41:53 np0005591762 systemd[1]: Created slice User Slice of UID 0.
Jan 22 04:41:53 np0005591762 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 22 04:41:53 np0005591762 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 22 04:41:53 np0005591762 systemd[1]: Starting User Manager for UID 0...
Jan 22 04:41:53 np0005591762 edpm-start-podman-container[133609]: Creating additional drop-in dependency for "ovn_controller" (03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083)
Jan 22 04:41:53 np0005591762 podman[133629]: 2026-01-22 09:41:53.509921144 +0000 UTC m=+0.052600855 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 04:41:53 np0005591762 systemd[1]: 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083-7226f9a2f4f08439.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 04:41:53 np0005591762 systemd[1]: 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083-7226f9a2f4f08439.service: Failed with result 'exit-code'.
Jan 22 04:41:53 np0005591762 systemd[1]: Reloading.
Jan 22 04:41:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:53 np0005591762 systemd[133652]: Queued start job for default target Main User Target.
Jan 22 04:41:53 np0005591762 systemd[133652]: Created slice User Application Slice.
Jan 22 04:41:53 np0005591762 systemd[133652]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 22 04:41:53 np0005591762 systemd[133652]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 04:41:53 np0005591762 systemd[133652]: Reached target Paths.
Jan 22 04:41:53 np0005591762 systemd[133652]: Reached target Timers.
Jan 22 04:41:53 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:41:53 np0005591762 systemd[133652]: Starting D-Bus User Message Bus Socket...
Jan 22 04:41:53 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:41:53 np0005591762 systemd[133652]: Starting Create User's Volatile Files and Directories...
Jan 22 04:41:53 np0005591762 systemd[133652]: Finished Create User's Volatile Files and Directories.
Jan 22 04:41:53 np0005591762 systemd[133652]: Listening on D-Bus User Message Bus Socket.
Jan 22 04:41:53 np0005591762 systemd[133652]: Reached target Sockets.
Jan 22 04:41:53 np0005591762 systemd[133652]: Reached target Basic System.
Jan 22 04:41:53 np0005591762 systemd[133652]: Reached target Main User Target.
Jan 22 04:41:53 np0005591762 systemd[133652]: Startup finished in 103ms.
Jan 22 04:41:53 np0005591762 systemd[1]: Started User Manager for UID 0.
Jan 22 04:41:53 np0005591762 systemd[1]: Started ovn_controller container.
Jan 22 04:41:53 np0005591762 systemd[1]: Started Session c1 of User root.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: INFO:__main__:Validating config file
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: INFO:__main__:Writing out command to execute
Jan 22 04:41:53 np0005591762 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: ++ cat /run_command
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + ARGS=
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + sudo kolla_copy_cacerts
Jan 22 04:41:53 np0005591762 systemd[1]: Started Session c2 of User root.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + [[ ! -n '' ]]
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + . kolla_extend_start
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + umask 0022
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 22 04:41:53 np0005591762 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8482] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8485] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <warn>  [1769074913.8487] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8491] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8493] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8497] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 04:41:53 np0005591762 kernel: br-int: entered promiscuous mode
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00019|main|INFO|OVS feature set changed, force recompute.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00020|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00022|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00023|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 04:41:53 np0005591762 ovn_controller[133622]: 2026-01-22T09:41:53Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8680] manager: (ovn-09ce5d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8685] manager: (ovn-e200ec-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 22 04:41:53 np0005591762 systemd-udevd[133751]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:41:53 np0005591762 kernel: genev_sys_6081: entered promiscuous mode
Jan 22 04:41:53 np0005591762 systemd-udevd[133753]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8785] device (genev_sys_6081): carrier: link connected
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.8787] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 22 04:41:53 np0005591762 NetworkManager[48910]: <info>  [1769074913.9916] manager: (ovn-eb0238-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 22 04:41:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:54.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:54 np0005591762 python3.9[133881]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 04:41:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:54.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:55 np0005591762 python3.9[134035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:41:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:55 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:41:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:41:55 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:41:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:41:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:56.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:41:56 np0005591762 python3.9[134158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074915.2389123-1829-168718815524198/.source.yaml _original_basename=.sw4uxlbf follow=False checksum=30769c46b1c73c629551b9176b18950dfb75be0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:41:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:56 np0005591762 python3.9[134311]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:56 np0005591762 ovs-vsctl[134312]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 22 04:41:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:41:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:56.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:57 np0005591762 python3.9[134465]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:57 np0005591762 ovs-vsctl[134467]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 22 04:41:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:58 np0005591762 python3.9[134620]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:41:58 np0005591762 ovs-vsctl[134621]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 22 04:41:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:41:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:41:58.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:41:58 np0005591762 systemd[1]: session-48.scope: Deactivated successfully.
Jan 22 04:41:58 np0005591762 systemd[1]: session-48.scope: Consumed 41.612s CPU time.
Jan 22 04:41:58 np0005591762 systemd-logind[744]: Session 48 logged out. Waiting for processes to exit.
Jan 22 04:41:58 np0005591762 systemd-logind[744]: Removed session 48.
Jan 22 04:41:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:41:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:41:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:41:58.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:41:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:41:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:41:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:41:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:00.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:00.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:42:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf8000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:02.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:02 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf8000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:02.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:03 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001e90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:03 np0005591762 systemd-logind[744]: New session 50 of user zuul.
Jan 22 04:42:03 np0005591762 systemd[1]: Started Session 50 of User zuul.
Jan 22 04:42:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:03 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:04 np0005591762 systemd[1]: Stopping User Manager for UID 0...
Jan 22 04:42:04 np0005591762 systemd[133652]: Activating special unit Exit the Session...
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped target Main User Target.
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped target Basic System.
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped target Paths.
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped target Sockets.
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped target Timers.
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 04:42:04 np0005591762 systemd[133652]: Closed D-Bus User Message Bus Socket.
Jan 22 04:42:04 np0005591762 systemd[133652]: Stopped Create User's Volatile Files and Directories.
Jan 22 04:42:04 np0005591762 systemd[133652]: Removed slice User Application Slice.
Jan 22 04:42:04 np0005591762 systemd[133652]: Reached target Shutdown.
Jan 22 04:42:04 np0005591762 systemd[133652]: Finished Exit the Session.
Jan 22 04:42:04 np0005591762 systemd[133652]: Reached target Exit the Session.
Jan 22 04:42:04 np0005591762 systemd[1]: user@0.service: Deactivated successfully.
Jan 22 04:42:04 np0005591762 systemd[1]: Stopped User Manager for UID 0.
Jan 22 04:42:04 np0005591762 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 22 04:42:04 np0005591762 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 22 04:42:04 np0005591762 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 22 04:42:04 np0005591762 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 22 04:42:04 np0005591762 systemd[1]: Removed slice User Slice of UID 0.
Jan 22 04:42:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:04.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:04 np0005591762 python3.9[134824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:42:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:04 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:04.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094205 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:42:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:05 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf8001dd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:05 np0005591762 python3.9[134982]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:05 np0005591762 python3.9[135134]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:05 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0002990 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:06 np0005591762 python3.9[135286]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:06.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:06 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc002de0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:06 np0005591762 python3.9[135438]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:06.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:07 np0005591762 python3.9[135591]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:07 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc002de0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:07 np0005591762 python3.9[135742]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:42:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:07 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf8001dd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:08.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:08 np0005591762 python3.9[135894]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 04:42:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:08 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0002990 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:42:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:08.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:42:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:09 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc003e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:09 np0005591762 python3.9[136046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:09 np0005591762 python3.9[136167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074928.8912003-215-67789499377582/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:09 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc003e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:10.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:10 np0005591762 python3.9[136317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:10 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf8001dd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094210 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:42:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:10 np0005591762 python3.9[136463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074929.931677-260-42545175715708/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:10.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:11 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00036a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:11 np0005591762 python3.9[136617]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:42:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:11 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc003e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:11 np0005591762 python3.9[136701]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:42:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:12.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:12 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc003e60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:12.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:13 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf80091b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:13 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:42:13 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:42:13 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:42:13 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:42:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:13 np0005591762 python3.9[136936]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:42:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:13 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00036a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:14.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:14 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005580 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:14 np0005591762 python3.9[137089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:14 np0005591762 python3.9[137211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074934.1711235-371-216894175085082/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:15.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:15 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005580 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:15 np0005591762 python3.9[137362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:15 np0005591762 python3.9[137483]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074935.0513759-371-190560365365927/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:15 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf80091b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:16.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:16 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00036a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:42:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:42:16 np0005591762 python3.9[137659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:17.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:17 np0005591762 python3.9[137780]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074936.4414697-504-188278850267082/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:17 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0063f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:17 np0005591762 python3.9[137931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:17 np0005591762 python3.9[138052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074937.216033-504-191897670735616/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:17 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0063f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:18.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:18 np0005591762 python3.9[138202]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:42:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:18 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf8009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:18 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:42:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:18 np0005591762 python3.9[138357]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:19.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:19 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00047a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:19 np0005591762 python3.9[138510]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:19 np0005591762 python3.9[138588]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:19 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0063f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:20 np0005591762 python3.9[138740]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:20.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:20 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0063f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:20 np0005591762 python3.9[138818]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:20 np0005591762 python3.9[138971]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:21.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:21 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0063f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:21 np0005591762 python3.9[139124]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:21 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:42:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:21 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:42:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:21 np0005591762 python3.9[139202]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:21 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00047a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:22.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:22 np0005591762 python3.9[139354]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:22 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a150 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:22 np0005591762 python3.9[139432]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:23.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:23 np0005591762 python3.9[139585]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:42:23 np0005591762 systemd[1]: Reloading.
Jan 22 04:42:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:23 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0063f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:23 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:42:23 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:42:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:23 np0005591762 ovn_controller[133622]: 2026-01-22T09:42:23Z|00025|memory|INFO|15872 kB peak resident set size after 29.9 seconds
Jan 22 04:42:23 np0005591762 ovn_controller[133622]: 2026-01-22T09:42:23Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 22 04:42:23 np0005591762 podman[139748]: 2026-01-22 09:42:23.720191622 +0000 UTC m=+0.066916199 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 04:42:23 np0005591762 python3.9[139791]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:23 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a150 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:24 np0005591762 python3.9[139876]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:24.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:24 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:24 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:42:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:24 np0005591762 python3.9[140028]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:24 np0005591762 python3.9[140107]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:25.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:25 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800ac50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:25 np0005591762 python3.9[140260]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:42:25 np0005591762 systemd[1]: Reloading.
Jan 22 04:42:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:25 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:42:25 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:42:25 np0005591762 systemd[1]: Starting Create netns directory...
Jan 22 04:42:25 np0005591762 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 04:42:25 np0005591762 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 04:42:25 np0005591762 systemd[1]: Finished Create netns directory.
Jan 22 04:42:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:25 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:26.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:26 np0005591762 python3.9[140453]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:26 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800ac50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:26 np0005591762 python3.9[140606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:27.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:27 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:27 np0005591762 python3.9[140729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769074946.4836078-956-66491494762602/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:27 np0005591762 python3.9[140882]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:27 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800ac50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:28.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:28 np0005591762 python3.9[141034]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:42:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:28 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:28 np0005591762 python3.9[141187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:29.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:29 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800ac50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:29 np0005591762 python3.9[141310]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074948.4891582-1055-129568064901569/.source.json _original_basename=.g1oppzfe follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:29 np0005591762 python3.9[141461]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:29 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:30.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:30 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800b960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094230 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:42:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:31.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:31 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:31 np0005591762 python3.9[141911]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 22 04:42:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:31 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf00050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:32 np0005591762 python3.9[142063]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 04:42:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:32.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:32 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800b960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:32 np0005591762 python3[142216]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 04:42:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:33.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:33 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800b960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:33 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:34.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:34 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:42:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:35.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:42:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:35 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:35 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:36.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:36 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:37.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:37 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:37 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:42:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:38.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:42:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:38 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:39.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:39 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:39 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:40.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:40 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:41.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:41 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:41 np0005591762 podman[142226]: 2026-01-22 09:42:41.841024693 +0000 UTC m=+8.961043303 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:42:41 np0005591762 podman[142340]: 2026-01-22 09:42:41.938788125 +0000 UTC m=+0.030176430 container create 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 04:42:41 np0005591762 podman[142340]: 2026-01-22 09:42:41.923998179 +0000 UTC m=+0.015386494 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:42:41 np0005591762 python3[142216]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:42:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:41 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800c670 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:42.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:42 np0005591762 python3.9[142520]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:42:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:42 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:43.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:43 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:43 np0005591762 python3.9[142676]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:43 np0005591762 python3.9[142752]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:42:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:43 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10004800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:44 np0005591762 python3.9[142903]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769074963.6549616-1289-36571499299067/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:44.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:44 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10004800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:44 np0005591762 python3.9[142979]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:42:44 np0005591762 systemd[1]: Reloading.
Jan 22 04:42:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:44 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:42:44 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:42:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:45.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:45 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:45 np0005591762 python3.9[143091]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:42:45 np0005591762 systemd[1]: Reloading.
Jan 22 04:42:45 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:42:45 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:42:45 np0005591762 systemd[1]: Starting ovn_metadata_agent container...
Jan 22 04:42:45 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:42:45 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e182a32a5761b6a8d8a95cc77691620f6fd4b0316d6eed4fa9139d8a1b93fbd9/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 22 04:42:45 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e182a32a5761b6a8d8a95cc77691620f6fd4b0316d6eed4fa9139d8a1b93fbd9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:42:45 np0005591762 systemd[1]: Started /usr/bin/podman healthcheck run 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a.
Jan 22 04:42:45 np0005591762 podman[143133]: 2026-01-22 09:42:45.564113986 +0000 UTC m=+0.072563331 container init 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 04:42:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + sudo -E kolla_set_configs
Jan 22 04:42:45 np0005591762 podman[143133]: 2026-01-22 09:42:45.588278855 +0000 UTC m=+0.096728191 container start 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:42:45 np0005591762 edpm-start-podman-container[143133]: ovn_metadata_agent
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Validating config file
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Copying service configuration files
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Writing out command to execute
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: ++ cat /run_command
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + CMD=neutron-ovn-metadata-agent
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + ARGS=
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + sudo kolla_copy_cacerts
Jan 22 04:42:45 np0005591762 edpm-start-podman-container[143132]: Creating additional drop-in dependency for "ovn_metadata_agent" (56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a)
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: Running command: 'neutron-ovn-metadata-agent'
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + [[ ! -n '' ]]
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + . kolla_extend_start
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + umask 0022
Jan 22 04:42:45 np0005591762 ovn_metadata_agent[143145]: + exec neutron-ovn-metadata-agent
Jan 22 04:42:45 np0005591762 systemd[1]: Reloading.
Jan 22 04:42:45 np0005591762 podman[143152]: 2026-01-22 09:42:45.6689569 +0000 UTC m=+0.072644754 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:42:45 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:42:45 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:42:45 np0005591762 systemd[1]: Started ovn_metadata_agent container.
Jan 22 04:42:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:45 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10004800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:46.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:46 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:46 np0005591762 python3.9[143377]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 04:42:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:42:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:47.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:42:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:47 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.144 143150 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.144 143150 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.144 143150 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.144 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.145 143150 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.146 143150 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.147 143150 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.148 143150 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.149 143150 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.150 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.151 143150 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.152 143150 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.153 143150 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.154 143150 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.155 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.156 143150 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.157 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.158 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.159 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.160 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.161 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.162 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.163 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.164 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.165 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.166 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.167 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.168 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.169 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.170 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.171 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.172 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.173 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.174 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.175 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.176 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.176 143150 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.176 143150 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.183 143150 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.183 143150 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.183 143150 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.183 143150 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.184 143150 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.194 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 61e0485d-79f8-4954-8f50-00743b2f8934 (UUID: 61e0485d-79f8-4954-8f50-00743b2f8934) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.209 143150 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.210 143150 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.210 143150 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.210 143150 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.212 143150 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.217 143150 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.221 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '61e0485d-79f8-4954-8f50-00743b2f8934'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], external_ids={}, name=61e0485d-79f8-4954-8f50-00743b2f8934, nb_cfg_timestamp=1769074921868, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.222 143150 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fe7b0d25af0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.222 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.222 143150 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.223 143150 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.223 143150 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.226 143150 DEBUG oslo_service.service [-] Started child 143404 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.229 143150 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp9abdoh47/privsep.sock']#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.229 143404 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-177287'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.245 143404 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.246 143404 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.246 143404 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.256 143404 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.262 143404 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.274 143404 INFO eventlet.wsgi.server [-] (143404) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 22 04:42:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:47 np0005591762 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 22 04:42:47 np0005591762 python3.9[143536]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.769 143150 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.769 143150 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9abdoh47/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.687 143537 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.690 143537 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.694 143537 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.694 143537 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143537#033[00m
Jan 22 04:42:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:47.771 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[6eeb2675-0d39-451f-a608-da42ba7997b5]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:42:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:47 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:48 np0005591762 python3.9[143666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769074967.3762305-1425-280041440657035/.source.yaml _original_basename=.a5v1umtu follow=False checksum=7bc24a5e53d8c45e39faf7b2fbdd2561f35405e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.202 143537 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.204 143537 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.204 143537 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:42:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:48.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:48 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.655 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3ce0ca-f728-49fa-9c97-b41535578741]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.657 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, column=external_ids, values=({'neutron:ovn-metadata-id': 'f74628ec-ac72-5917-bc48-d0b94308a0d0'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.662 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.666 143150 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.666 143150 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.667 143150 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.668 143150 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.669 143150 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.670 143150 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.671 143150 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.672 143150 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.673 143150 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.674 143150 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.675 143150 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.676 143150 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.677 143150 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.678 143150 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.679 143150 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.680 143150 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.681 143150 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.682 143150 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.683 143150 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.684 143150 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.685 143150 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.686 143150 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.687 143150 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.688 143150 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.689 143150 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.690 143150 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.691 143150 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.692 143150 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.693 143150 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.694 143150 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.695 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.696 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.697 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.698 143150 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.699 143150 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:42:48 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:42:48.699 143150 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 04:42:48 np0005591762 systemd-logind[744]: Session 50 logged out. Waiting for processes to exit.
Jan 22 04:42:48 np0005591762 systemd[1]: session-50.scope: Deactivated successfully.
Jan 22 04:42:48 np0005591762 systemd[1]: session-50.scope: Consumed 40.885s CPU time.
Jan 22 04:42:48 np0005591762 systemd-logind[744]: Removed session 50.
Jan 22 04:42:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:49.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:50 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:50.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:50 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10004800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:51.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:51 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc0078e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:52 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:52.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:52 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:53.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:53 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:53 np0005591762 systemd[1]: Started Session 51 of User zuul.
Jan 22 04:42:53 np0005591762 systemd-logind[744]: New session 51 of user zuul.
Jan 22 04:42:53 np0005591762 podman[143724]: 2026-01-22 09:42:53.813970433 +0000 UTC m=+0.058472511 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 04:42:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:54 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:54.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:54 np0005591762 python3.9[143899]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:42:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:54 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:55.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:55 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10005fb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:55 np0005591762 python3.9[144057]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:42:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:56 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:56.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:56 np0005591762 python3.9[144219]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:42:56 np0005591762 systemd[1]: Reloading.
Jan 22 04:42:56 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:42:56 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:42:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:56 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:42:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:42:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:57.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:42:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:57 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:57 np0005591762 python3.9[144406]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:42:57 np0005591762 network[144423]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:42:57 np0005591762 network[144424]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:42:57 np0005591762 network[144425]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:42:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:58 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:42:58.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:58 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:42:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:42:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:42:59.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:42:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:42:59 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:42:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:42:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:42:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:42:59 np0005591762 python3.9[144689]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:00 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c10005fb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:00.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:00 np0005591762 python3.9[144842]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:00 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:00 np0005591762 python3.9[144996]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:01 np0005591762 python3.9[145150]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:01 np0005591762 python3.9[145303]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:02 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:02.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:02 np0005591762 python3.9[145456]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:02 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:02 np0005591762 python3.9[145610]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:43:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:43:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:03.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:43:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:03 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:03 np0005591762 python3.9[145764]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:04 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:04.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:04 np0005591762 python3.9[145916]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:04 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:04 np0005591762 python3.9[146068]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:05.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:05 np0005591762 python3.9[146221]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:05 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:05 np0005591762 python3.9[146374]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:05 np0005591762 python3.9[146526]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:06 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:06.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:06 np0005591762 python3.9[146678]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:06 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:06 np0005591762 python3.9[146831]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:07.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:07 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:07 np0005591762 python3.9[146984]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:07 np0005591762 python3.9[147136]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:08 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:08 np0005591762 python3.9[147288]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:08.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:08 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:08 np0005591762 python3.9[147440]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:09.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:09 np0005591762 python3.9[147593]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:09 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:09 np0005591762 python3.9[147746]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:43:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:10 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:10 np0005591762 python3.9[147898]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:10.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:10 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:10 np0005591762 python3.9[148075]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 04:43:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:11.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:11 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:11 np0005591762 python3.9[148231]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:43:11 np0005591762 systemd[1]: Reloading.
Jan 22 04:43:11 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:43:11 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:43:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:12 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180bf340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:12 np0005591762 python3.9[148418]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:12.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:12 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:12 np0005591762 python3.9[148571]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:12 np0005591762 python3.9[148725]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:13.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:13 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180bf340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:13 np0005591762 python3.9[148879]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:13 np0005591762 python3.9[149032]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:14 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:43:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:43:14 np0005591762 python3.9[149185]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:14 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:14 np0005591762 python3.9[149339]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:43:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:43:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:15.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:43:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:15 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140050f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:15 np0005591762 python3.9[149493]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 22 04:43:15 np0005591762 podman[149569]: 2026-01-22 09:43:15.956909717 +0000 UTC m=+0.052648383 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 04:43:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:16 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140050f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 22 04:43:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:16.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 22 04:43:16 np0005591762 python3.9[149731]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 04:43:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:16 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140050f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:17 np0005591762 python3.9[149900]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 04:43:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:17 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:43:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:17.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:17 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:17 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:43:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:17 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800cf90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:17 np0005591762 python3.9[150062]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:43:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:18 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140061f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:18.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:18 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140061f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:18 np0005591762 python3.9[150146]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:43:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:19.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:19 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c0320 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:20 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:43:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:43:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:20.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:43:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:20 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:21.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:21 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140061f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:22 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c0320 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:22.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:22 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:23.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:23 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094323 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:43:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:24 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c140061f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:24.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:24 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c12f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:24 np0005591762 podman[150187]: 2026-01-22 09:43:24.832305228 +0000 UTC m=+0.056243649 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 04:43:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:25.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:25 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c12f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:26 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c12f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:26.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:26 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008180 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:27.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:27 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:28 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:28.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:28 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:29.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:29 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:30 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c12f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:30.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:30 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:31.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:31 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:31 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:43:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:32 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:32.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:32 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:33.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:33 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:34 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:34.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:34 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:43:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:34 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:43:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:34 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:35.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:35 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:36 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:36.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:36 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:37.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:37 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:37 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:43:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:38 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:38.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:38 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:43:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:39.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:43:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:39 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:40 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:40.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:40 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c14008780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:41.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:41 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:42 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:42.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:42 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:43.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:43 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094343 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:43:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:44 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:44.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:44 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:45.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:45 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c100068d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:45 np0005591762 kernel: SELinux:  Converting 2780 SID table entries...
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:43:45 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:43:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:46 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:46.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:46 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:46 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 22 04:43:46 np0005591762 podman[150447]: 2026-01-22 09:43:46.816982523 +0000 UTC m=+0.037585195 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 04:43:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:47 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:43:47.185 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:43:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:43:47.185 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:43:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:43:47.185 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:43:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:48 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:48.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:48 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:49.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:49 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:50 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:50.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:50 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:51 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 04:43:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2588 writes, 14K keys, 2588 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s#012Cumulative WAL: 2588 writes, 2588 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2588 writes, 14K keys, 2588 commit groups, 1.0 writes per commit group, ingest: 39.37 MB, 0.07 MB/s#012Interval WAL: 2588 writes, 2588 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    495.3      0.05              0.03         6    0.008       0      0       0.0       0.0#012  L6      1/0   11.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   2.7    563.8    477.7      0.13              0.09         5    0.026     20K   2361       0.0       0.0#012 Sum      1/0   11.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.7    416.3    482.3      0.18              0.12        11    0.016     20K   2361       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.7    419.1    485.6      0.17              0.12        10    0.017     20K   2361       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    563.8    477.7      0.13              0.09         5    0.026     20K   2361       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    508.3      0.04              0.03         5    0.009       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.022, interval 0.022#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.2 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a025f49350#2 capacity: 304.00 MB usage: 2.21 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(155,2.01 MB,0.661383%) FilterBlock(11,64.05 KB,0.0205743%) IndexBlock(11,135.27 KB,0.0434524%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 22 04:43:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:52 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:52.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:52 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:52 np0005591762 kernel: SELinux:  Converting 2780 SID table entries...
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:43:52 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:43:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:43:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:53.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:43:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:53 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:54 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:54.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:54 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:55.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:55 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:55 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 22 04:43:55 np0005591762 podman[150506]: 2026-01-22 09:43:55.840883253 +0000 UTC m=+0.058173312 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:43:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:56 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:56.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:56 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:43:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:57.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:57 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:58 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:43:58.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:58 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:43:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:43:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:43:59.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:43:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:43:59 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:43:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:43:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:43:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:43:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:00 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:00.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:00 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:01.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:01 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:02 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:02.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:02 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:03.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:03 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:04 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:04.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:04 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf800a680 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:05.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:05 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bf0003f10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:06 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bfc005f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:06.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:06 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:07.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:07 np0005591762 kernel: ganesha.nfsd[148077]: segfault at 50 ip 00007f5c83e9c32e sp 00007f5bebff6210 error 4 in libntirpc.so.5.8[7f5c83e81000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 22 04:44:07 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:44:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[132761]: 22/01/2026 09:44:07 : epoch 6971f0dd : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c180c23f0 fd 47 proxy ignored for local
Jan 22 04:44:07 np0005591762 systemd[1]: Started Process Core Dump (PID 155261/UID 0).
Jan 22 04:44:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:08 np0005591762 systemd-coredump[155272]: Process 132772 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007f5c83e9c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f5c83ea6900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 22 04:44:08 np0005591762 systemd[1]: systemd-coredump@4-155261-0.service: Deactivated successfully.
Jan 22 04:44:08 np0005591762 systemd[1]: systemd-coredump@4-155261-0.service: Consumed 1.066s CPU time.
Jan 22 04:44:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:08.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:08 np0005591762 podman[156275]: 2026-01-22 09:44:08.346354374 +0000 UTC m=+0.016795398 container died afe0afb023a63d7e0b7f9bc5b6e24fe80b2cc6f5b44e1b52402d24e4a2736f4b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:44:08 np0005591762 systemd[1]: var-lib-containers-storage-overlay-53f838b35c7a6b18cc84d254555158c8eb346ad6decd63a5d403eb66cae96acf-merged.mount: Deactivated successfully.
Jan 22 04:44:08 np0005591762 podman[156275]: 2026-01-22 09:44:08.3825744 +0000 UTC m=+0.053015415 container remove afe0afb023a63d7e0b7f9bc5b6e24fe80b2cc6f5b44e1b52402d24e4a2736f4b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:44:08 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:44:08 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:44:08 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.080s CPU time.
Jan 22 04:44:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000012s ======
Jan 22 04:44:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:09.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Jan 22 04:44:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:10.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:11.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:12.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:44:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:13.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:44:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094413 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:44:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:14.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:15.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:16.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 04:44:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5918 writes, 25K keys, 5918 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5918 writes, 1040 syncs, 5.69 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5918 writes, 25K keys, 5918 commit groups, 1.0 writes per commit group, ingest: 19.22 MB, 0.03 MB/s#012Interval WAL: 5918 writes, 1040 syncs, 5.69 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 22 04:44:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:17.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:17 np0005591762 podman[165395]: 2026-01-22 09:44:17.815954428 +0000 UTC m=+0.040020856 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 04:44:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:18.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:18 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 5.
Jan 22 04:44:18 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:44:18 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.080s CPU time.
Jan 22 04:44:18 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:18 np0005591762 podman[166303]: 2026-01-22 09:44:18.668514605 +0000 UTC m=+0.038204181 container create f9e2faedce6d4623ec9beffdfc7b557a07f855dc16802d8cf93ecb407326acea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:44:18 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a7ee26d16fc4932c5af623a4a19e0017b7009c09c3606b7694b2c994cd78952/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:44:18 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a7ee26d16fc4932c5af623a4a19e0017b7009c09c3606b7694b2c994cd78952/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:44:18 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a7ee26d16fc4932c5af623a4a19e0017b7009c09c3606b7694b2c994cd78952/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:44:18 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a7ee26d16fc4932c5af623a4a19e0017b7009c09c3606b7694b2c994cd78952/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:44:18 np0005591762 podman[166303]: 2026-01-22 09:44:18.710228293 +0000 UTC m=+0.079917859 container init f9e2faedce6d4623ec9beffdfc7b557a07f855dc16802d8cf93ecb407326acea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 22 04:44:18 np0005591762 podman[166303]: 2026-01-22 09:44:18.714844099 +0000 UTC m=+0.084533665 container start f9e2faedce6d4623ec9beffdfc7b557a07f855dc16802d8cf93ecb407326acea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 22 04:44:18 np0005591762 bash[166303]: f9e2faedce6d4623ec9beffdfc7b557a07f855dc16802d8cf93ecb407326acea
Jan 22 04:44:18 np0005591762 podman[166303]: 2026-01-22 09:44:18.647665659 +0000 UTC m=+0.017355245 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:44:18 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:44:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:18 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:44:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:19.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:20.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:44:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:21.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:44:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:44:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:22.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:44:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:23.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:24.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:24 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:44:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:24 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:44:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:25.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:25 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:44:25 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:44:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:26.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:26 np0005591762 podman[167774]: 2026-01-22 09:44:26.838972153 +0000 UTC m=+0.058614578 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:44:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:27.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:44:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:28.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:44:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:29 np0005591762 kernel: SELinux:  Converting 2781 SID table entries...
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability open_perms=1
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability always_check_network=0
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 04:44:29 np0005591762 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 04:44:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:44:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:29.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:44:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:29 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:44:29 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 22 04:44:29 np0005591762 dbus-broker-launch[712]: Noticed file-system modification, trigger reload.
Jan 22 04:44:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:30.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:44:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:30 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:44:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:31.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:31 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:32 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:32.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:32 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71ec001ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:33.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094433 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:44:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:33 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:34 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc002f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:34.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:34 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc002f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:35 np0005591762 systemd[1]: Stopping OpenSSH server daemon...
Jan 22 04:44:35 np0005591762 systemd[1]: sshd.service: Deactivated successfully.
Jan 22 04:44:35 np0005591762 systemd[1]: Stopped OpenSSH server daemon.
Jan 22 04:44:35 np0005591762 systemd[1]: sshd.service: Consumed 1.393s CPU time, read 32.0K from disk, written 0B to disk.
Jan 22 04:44:35 np0005591762 systemd[1]: Stopped target sshd-keygen.target.
Jan 22 04:44:35 np0005591762 systemd[1]: Stopping sshd-keygen.target...
Jan 22 04:44:35 np0005591762 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 04:44:35 np0005591762 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 04:44:35 np0005591762 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 04:44:35 np0005591762 systemd[1]: Reached target sshd-keygen.target.
Jan 22 04:44:35 np0005591762 systemd[1]: Starting OpenSSH server daemon...
Jan 22 04:44:35 np0005591762 systemd[1]: Started OpenSSH server daemon.
Jan 22 04:44:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:35.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:35 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71ec0029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:36 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:36 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:44:36 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:44:36 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:36.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:36 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:36 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:36 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc002f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:36 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:44:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:37.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:37 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc002f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:38 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71ec0029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:38.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:38 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e80091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:39 np0005591762 python3.9[173351]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:44:39 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:44:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:39.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:44:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:39 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc002eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:39 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:39 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:39 np0005591762 python3.9[174762]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:44:39 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:40 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc002eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:40 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:40 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:40.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:40 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71ec0029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:40 np0005591762 python3.9[176082]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:44:40 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:40 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:40 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:41.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:41 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8009330 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:41 np0005591762 python3.9[177445]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:44:41 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:41 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:41 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:41 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:44:41 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:44:41 np0005591762 systemd[1]: man-db-cache-update.service: Consumed 6.998s CPU time.
Jan 22 04:44:41 np0005591762 systemd[1]: run-r085318f1f96f46d2969b441092c1f5e8.service: Deactivated successfully.
Jan 22 04:44:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:42 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:42.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:42 np0005591762 python3.9[178312]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:42 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:42 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:42 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:42 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:43.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:43 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:43 np0005591762 python3.9[178504]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:43 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:43 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:43 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:44 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8009c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:44 np0005591762 python3.9[178694]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:44 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:44 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:44 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:44.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:44 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8009c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:44 np0005591762 python3.9[178886]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:45.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:45 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:45 np0005591762 python3.9[179042]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:45 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:45 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:45 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:46 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e00035c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:46.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:46 np0005591762 python3.9[179233]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 04:44:46 np0005591762 systemd[1]: Reloading.
Jan 22 04:44:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:46 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:44:46 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:44:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:46 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8009c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:46 np0005591762 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 22 04:44:46 np0005591762 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 22 04:44:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:47.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:44:47.186 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:44:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:44:47.186 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:44:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:44:47.186 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:44:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:47 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e8009c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:47 np0005591762 python3.9[179427]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:47 np0005591762 python3.9[179582]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:47 np0005591762 podman[179584]: 2026-01-22 09:44:47.988182115 +0000 UTC m=+0.038544121 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:44:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:48 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:48.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:48 np0005591762 python3.9[179753]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:48 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0003740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:49 np0005591762 python3.9[179909]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:49.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:49 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800b650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:49 np0005591762 python3.9[180065]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:50 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800b650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:50 np0005591762 python3.9[180220]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:50.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:50 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:50 np0005591762 python3.9[180375]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:51.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:51 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:51 np0005591762 python3.9[180556]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:51 np0005591762 python3.9[180712]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:52 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800b650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:52.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:52 np0005591762 python3.9[180867]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:52 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800b650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:52 np0005591762 python3.9[181023]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:53.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:53 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:53 np0005591762 python3.9[181179]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:54 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:54 np0005591762 python3.9[181334]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:54.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:54 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800b650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:54 np0005591762 python3.9[181489]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 04:44:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:55.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:55 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800b650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:55 np0005591762 python3.9[181646]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:44:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:55 np0005591762 python3.9[181798]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:44:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:56 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:56 np0005591762 python3.9[181950]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:44:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:56.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:56 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:56 np0005591762 python3.9[182103]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:44:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:44:57 np0005591762 podman[182227]: 2026-01-22 09:44:57.161134442 +0000 UTC m=+0.088433381 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 04:44:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:57.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:57 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:57 np0005591762 python3.9[182275]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:44:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:57 np0005591762 python3.9[182431]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:44:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:58 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800d160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:58 np0005591762 python3.9[182581]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:44:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:44:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:44:58.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:44:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:58 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:58 np0005591762 python3.9[182734]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:44:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:44:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:44:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:44:59.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:44:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:44:59 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:44:59 np0005591762 python3.9[182860]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075098.422088-1644-41827396185125/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:44:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:44:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:44:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:44:59 np0005591762 python3.9[183012]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:45:00 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:00 np0005591762 python3.9[183137]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075099.5007548-1644-205348495079491/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:00.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:45:00 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71e800d160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:00 np0005591762 python3.9[183289]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:01 np0005591762 python3.9[183415]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075100.3180103-1644-195215932630518/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:01.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:01 np0005591762 kernel: ganesha.nfsd[167889]: segfault at 50 ip 00007f727426f32e sp 00007f71f9ffa210 error 4 in libntirpc.so.5.8[7f7274254000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 22 04:45:01 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:45:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[166377]: 22/01/2026 09:45:01 : epoch 6971f172 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f71dc005620 fd 38 proxy ignored for local
Jan 22 04:45:01 np0005591762 systemd[1]: Started Process Core Dump (PID 183487/UID 0).
Jan 22 04:45:01 np0005591762 python3.9[183570]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:01 np0005591762 python3.9[183695]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075101.1816862-1644-221116909291093/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:02 np0005591762 systemd-coredump[183494]: Process 166405 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 44:#012#0  0x00007f727426f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:45:02 np0005591762 systemd[1]: systemd-coredump@5-183487-0.service: Deactivated successfully.
Jan 22 04:45:02 np0005591762 podman[183852]: 2026-01-22 09:45:02.332184227 +0000 UTC m=+0.022927305 container died f9e2faedce6d4623ec9beffdfc7b557a07f855dc16802d8cf93ecb407326acea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:45:02 np0005591762 systemd[1]: var-lib-containers-storage-overlay-0a7ee26d16fc4932c5af623a4a19e0017b7009c09c3606b7694b2c994cd78952-merged.mount: Deactivated successfully.
Jan 22 04:45:02 np0005591762 podman[183852]: 2026-01-22 09:45:02.352209909 +0000 UTC m=+0.042952977 container remove f9e2faedce6d4623ec9beffdfc7b557a07f855dc16802d8cf93ecb407326acea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:45:02 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:45:02 np0005591762 python3.9[183848]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:02.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:02 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:45:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:02 np0005591762 python3.9[184010]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075102.0613997-1644-21943694413092/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:03.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:03 np0005591762 python3.9[184163]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:03 np0005591762 python3.9[184288]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075102.912726-1644-281005906488535/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:04 np0005591762 python3.9[184440]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:04.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:04 np0005591762 python3.9[184563]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075103.784616-1644-144346000205284/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:04 np0005591762 python3.9[184716]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:05.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:05 np0005591762 python3.9[184842]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769075104.623577-1644-96924835275052/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:05 np0005591762 python3.9[184994]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 22 04:45:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:06.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:06 np0005591762 python3.9[185147]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:07 np0005591762 python3.9[185300]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:07.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094507 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:45:07 np0005591762 python3.9[185453]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:07 np0005591762 python3.9[185605]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:08 np0005591762 python3.9[185757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:08.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:08 np0005591762 python3.9[185910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:09 np0005591762 python3.9[186062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:09.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:09 np0005591762 python3.9[186215]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:10 np0005591762 python3.9[186367]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:10.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:10 np0005591762 python3.9[186519]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:10 np0005591762 python3.9[186697]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:11.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:11 np0005591762 python3.9[186850]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:11 np0005591762 python3.9[187002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:12 np0005591762 python3.9[187154]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:12.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:12 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 6.
Jan 22 04:45:12 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:45:12 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:12 np0005591762 podman[187216]: 2026-01-22 09:45:12.654718351 +0000 UTC m=+0.027866662 container create ba603b3ec1670262de33164f7fc30dbe2aec300199fdefda54c8a77ebb4aee93 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:45:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05589f9e9a66e7ea6c40e1815a86f5abb0cdb775f4eeedf5d8dc9d336b3e33cf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05589f9e9a66e7ea6c40e1815a86f5abb0cdb775f4eeedf5d8dc9d336b3e33cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05589f9e9a66e7ea6c40e1815a86f5abb0cdb775f4eeedf5d8dc9d336b3e33cf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05589f9e9a66e7ea6c40e1815a86f5abb0cdb775f4eeedf5d8dc9d336b3e33cf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:12 np0005591762 podman[187216]: 2026-01-22 09:45:12.698328136 +0000 UTC m=+0.071476468 container init ba603b3ec1670262de33164f7fc30dbe2aec300199fdefda54c8a77ebb4aee93 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:45:12 np0005591762 podman[187216]: 2026-01-22 09:45:12.704600989 +0000 UTC m=+0.077749300 container start ba603b3ec1670262de33164f7fc30dbe2aec300199fdefda54c8a77ebb4aee93 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 04:45:12 np0005591762 bash[187216]: ba603b3ec1670262de33164f7fc30dbe2aec300199fdefda54c8a77ebb4aee93
Jan 22 04:45:12 np0005591762 podman[187216]: 2026-01-22 09:45:12.6436208 +0000 UTC m=+0.016769110 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:45:12 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:45:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:12 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:45:13 np0005591762 python3.9[187397]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:13.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:13 np0005591762 python3.9[187521]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075112.77589-2306-255705268812907/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:14 np0005591762 python3.9[187673]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:14.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:14 np0005591762 python3.9[187796]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075113.7812726-2306-231784087452690/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:15 np0005591762 python3.9[187949]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:15.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:15 np0005591762 python3.9[188073]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075114.659916-2306-254919458132946/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:15 np0005591762 python3.9[188225]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:16 np0005591762 python3.9[188348]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075115.5171762-2306-85602937727484/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:16.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:16 np0005591762 python3.9[188501]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:17 np0005591762 python3.9[188624]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075116.3740203-2306-103333452157222/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:17.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:17 np0005591762 python3.9[188777]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:17 np0005591762 python3.9[188900]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075117.229154-2306-160520373202523/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:18 np0005591762 podman[189024]: 2026-01-22 09:45:18.231099593 +0000 UTC m=+0.042280520 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 04:45:18 np0005591762 python3.9[189068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:18.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:18 np0005591762 python3.9[189193]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075118.040747-2306-119144046978083/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:18 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:45:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:18 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:45:19 np0005591762 python3.9[189345]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:19 np0005591762 python3.9[189469]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075118.8580208-2306-168006005740205/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:20 np0005591762 python3.9[189621]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:20.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:20 np0005591762 python3.9[189744]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075119.6971097-2306-151653836908572/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:20 np0005591762 python3.9[189897]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:21 np0005591762 python3.9[190021]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075120.5367084-2306-162758207741659/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:21 np0005591762 python3.9[190173]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:22 np0005591762 python3.9[190296]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075121.3564541-2306-158139883222193/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:22.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:22 np0005591762 python3.9[190448]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:22 np0005591762 python3.9[190572]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075122.148285-2306-155102720035717/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:23.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:23 np0005591762 python3.9[190725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:23 np0005591762 python3.9[190848]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075122.9633152-2306-50053470468445/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:24 np0005591762 python3.9[191000]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:24.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:24 np0005591762 python3.9[191123]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075123.9205408-2306-91543933396914/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:45:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:24 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:45:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:25.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:25 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0010000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:25 np0005591762 python3.9[191370]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:26 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00040013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:45:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:45:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:45:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:45:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:26.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:26 np0005591762 python3.9[191525]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 22 04:45:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:26 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0008001e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:27.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094527 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:45:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:27 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0010000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:27 np0005591762 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 22 04:45:27 np0005591762 podman[191655]: 2026-01-22 09:45:27.752234184 +0000 UTC m=+0.065916801 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 04:45:27 np0005591762 python3.9[191698]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:28 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efffc001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:28 np0005591762 python3.9[191857]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:28.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:28 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0010000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:28 np0005591762 python3.9[192010]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:29 np0005591762 python3.9[192162]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:29.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[187228]: 22/01/2026 09:45:29 : epoch 6971f1a8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0010000df0 fd 39 proxy ignored for local
Jan 22 04:45:29 np0005591762 kernel: ganesha.nfsd[191150]: segfault at 50 ip 00007f009381232e sp 00007f001affc210 error 4 in libntirpc.so.5.8[7f00937f7000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 22 04:45:29 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:45:29 np0005591762 systemd[1]: Started Process Core Dump (PID 192210/UID 0).
Jan 22 04:45:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:29 np0005591762 python3.9[192342]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:45:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:45:30 np0005591762 systemd-coredump[192214]: Process 187232 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007f009381232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:45:30 np0005591762 systemd[1]: systemd-coredump@6-192210-0.service: Deactivated successfully.
Jan 22 04:45:30 np0005591762 systemd[1]: systemd-coredump@6-192210-0.service: Consumed 1.071s CPU time.
Jan 22 04:45:30 np0005591762 python3.9[192494]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:30 np0005591762 podman[192499]: 2026-01-22 09:45:30.399863754 +0000 UTC m=+0.023053287 container died ba603b3ec1670262de33164f7fc30dbe2aec300199fdefda54c8a77ebb4aee93 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 22 04:45:30 np0005591762 systemd[1]: var-lib-containers-storage-overlay-05589f9e9a66e7ea6c40e1815a86f5abb0cdb775f4eeedf5d8dc9d336b3e33cf-merged.mount: Deactivated successfully.
Jan 22 04:45:30 np0005591762 podman[192499]: 2026-01-22 09:45:30.418936296 +0000 UTC m=+0.042125829 container remove ba603b3ec1670262de33164f7fc30dbe2aec300199fdefda54c8a77ebb4aee93 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 22 04:45:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:30.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:30 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:45:30 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:45:30 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.008s CPU time.
Jan 22 04:45:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:30 np0005591762 python3.9[192683]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:31.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:31 np0005591762 python3.9[192861]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:31 np0005591762 python3.9[193013]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:32 np0005591762 python3.9[193165]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:32.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:32 np0005591762 python3.9[193318]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:45:32 np0005591762 systemd[1]: Reloading.
Jan 22 04:45:32 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:45:32 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:45:33 np0005591762 systemd[1]: Starting libvirt logging daemon socket...
Jan 22 04:45:33 np0005591762 systemd[1]: Listening on libvirt logging daemon socket.
Jan 22 04:45:33 np0005591762 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 22 04:45:33 np0005591762 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 22 04:45:33 np0005591762 systemd[1]: Starting libvirt logging daemon...
Jan 22 04:45:33 np0005591762 systemd[1]: Started libvirt logging daemon.
Jan 22 04:45:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:33.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:33 np0005591762 python3.9[193511]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:45:33 np0005591762 systemd[1]: Reloading.
Jan 22 04:45:34 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:45:34 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:45:34 np0005591762 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 22 04:45:34 np0005591762 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 22 04:45:34 np0005591762 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 22 04:45:34 np0005591762 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 22 04:45:34 np0005591762 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 22 04:45:34 np0005591762 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 22 04:45:34 np0005591762 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 04:45:34 np0005591762 systemd[1]: Started libvirt nodedev daemon.
Jan 22 04:45:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:34.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:34 np0005591762 python3.9[193728]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:45:34 np0005591762 systemd[1]: Reloading.
Jan 22 04:45:34 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:45:34 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:45:35 np0005591762 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 22 04:45:35 np0005591762 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 22 04:45:35 np0005591762 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt proxy daemon...
Jan 22 04:45:35 np0005591762 systemd[1]: Started libvirt proxy daemon.
Jan 22 04:45:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:45:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:35.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:45:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094535 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:45:35 np0005591762 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 22 04:45:35 np0005591762 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 22 04:45:35 np0005591762 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 22 04:45:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:35 np0005591762 python3.9[193944]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:45:35 np0005591762 systemd[1]: Reloading.
Jan 22 04:45:35 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:45:35 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:45:35 np0005591762 systemd[1]: Listening on libvirt locking daemon socket.
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 22 04:45:35 np0005591762 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 22 04:45:35 np0005591762 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 22 04:45:35 np0005591762 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 22 04:45:35 np0005591762 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 22 04:45:35 np0005591762 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 22 04:45:35 np0005591762 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 22 04:45:35 np0005591762 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 04:45:35 np0005591762 systemd[1]: Started libvirt QEMU daemon.
Jan 22 04:45:36 np0005591762 setroubleshoot[193765]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 46b36540-5ec1-47b0-b736-925d1a3f3d6b
Jan 22 04:45:36 np0005591762 setroubleshoot[193765]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 22 04:45:36 np0005591762 setroubleshoot[193765]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 46b36540-5ec1-47b0-b736-925d1a3f3d6b
Jan 22 04:45:36 np0005591762 setroubleshoot[193765]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 22 04:45:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:36.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:36 np0005591762 python3.9[194166]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:45:36 np0005591762 systemd[1]: Reloading.
Jan 22 04:45:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:36 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:45:36 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:45:36 np0005591762 systemd[1]: Starting libvirt secret daemon socket...
Jan 22 04:45:36 np0005591762 systemd[1]: Listening on libvirt secret daemon socket.
Jan 22 04:45:36 np0005591762 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 22 04:45:36 np0005591762 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 22 04:45:36 np0005591762 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 22 04:45:36 np0005591762 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 22 04:45:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:36 np0005591762 systemd[1]: Starting libvirt secret daemon...
Jan 22 04:45:36 np0005591762 systemd[1]: Started libvirt secret daemon.
Jan 22 04:45:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:45:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:37.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:45:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:37 np0005591762 python3.9[194381]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:38 np0005591762 python3.9[194533]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 04:45:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:38.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:38 np0005591762 python3.9[194685]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:39.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:39 np0005591762 python3.9[194841]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 04:45:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:39 np0005591762 python3.9[194991]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:40 np0005591762 python3.9[195112]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075139.5738223-3380-261290122294650/.source.xml follow=False _original_basename=secret.xml.j2 checksum=ee8dac29edb10d989fc7d8a43619a77a19a44d77 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:40.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:40 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 7.
Jan 22 04:45:40 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:45:40 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.008s CPU time.
Jan 22 04:45:40 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:45:40 np0005591762 podman[195304]: 2026-01-22 09:45:40.761124589 +0000 UTC m=+0.029183115 container create bb66cd37179fccc78f2183aee4fa1a3327426aa6e023090d182f72162b129d29 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 22 04:45:40 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36926eb7c0967c0ecca10a03f7e34f88b294ad3e3efd5c94efe557b9ccbd8aee/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:40 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36926eb7c0967c0ecca10a03f7e34f88b294ad3e3efd5c94efe557b9ccbd8aee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:40 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36926eb7c0967c0ecca10a03f7e34f88b294ad3e3efd5c94efe557b9ccbd8aee/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:40 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36926eb7c0967c0ecca10a03f7e34f88b294ad3e3efd5c94efe557b9ccbd8aee/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:45:40 np0005591762 python3.9[195277]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 43df7a30-cf5f-5209-adfd-bf44298b19f2#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:40 np0005591762 podman[195304]: 2026-01-22 09:45:40.813044619 +0000 UTC m=+0.081103156 container init bb66cd37179fccc78f2183aee4fa1a3327426aa6e023090d182f72162b129d29 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:45:40 np0005591762 podman[195304]: 2026-01-22 09:45:40.818299601 +0000 UTC m=+0.086358128 container start bb66cd37179fccc78f2183aee4fa1a3327426aa6e023090d182f72162b129d29 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 04:45:40 np0005591762 bash[195304]: bb66cd37179fccc78f2183aee4fa1a3327426aa6e023090d182f72162b129d29
Jan 22 04:45:40 np0005591762 podman[195304]: 2026-01-22 09:45:40.748892526 +0000 UTC m=+0.016951073 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:45:40 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:45:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:40 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:45:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:41.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:41 np0005591762 python3.9[195520]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:42.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:42 np0005591762 python3.9[195984]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:43.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:43 np0005591762 python3.9[196137]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:43 np0005591762 python3.9[196260]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075143.1444986-3546-14235391223978/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:44.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:44 np0005591762 python3.9[196412]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:44 np0005591762 auditd[675]: Audit daemon rotating log files
Jan 22 04:45:44 np0005591762 python3.9[196565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:45.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:45 np0005591762 python3.9[196644]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:45 np0005591762 python3.9[196796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:46 np0005591762 python3.9[196874]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.kk92iu23 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:46 np0005591762 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 22 04:45:46 np0005591762 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 22 04:45:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:46.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:46 np0005591762 python3.9[197026]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:46 np0005591762 python3.9[197105]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:46 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:45:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:46 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:45:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:45:47.187 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:45:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:45:47.187 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:45:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:45:47.187 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:45:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:47.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:47 np0005591762 python3.9[197258]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:48 np0005591762 python3[197411]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 04:45:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:48.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:48 np0005591762 podman[197535]: 2026-01-22 09:45:48.445439311 +0000 UTC m=+0.042090251 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 04:45:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:48 np0005591762 python3.9[197580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:49 np0005591762 python3.9[197659]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:49.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:49 np0005591762 python3.9[197812]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:50 np0005591762 python3.9[197937]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075149.2188416-3812-135238808821079/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:50.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:50 np0005591762 python3.9[198089]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:50 np0005591762 python3.9[198168]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:51.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:51 np0005591762 python3.9[198346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:51 np0005591762 python3.9[198424]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:52 np0005591762 python3.9[198576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:52.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:52 np0005591762 python3.9[198702]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769075151.896532-3930-163875906203997/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:45:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:52 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:45:53 np0005591762 python3.9[198867]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:53 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2820000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:53.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:53 np0005591762 python3.9[199023]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:54 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2818001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:54 np0005591762 python3.9[199178]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:54.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:54 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2818001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:45:54 np0005591762 python3.9[199331]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:55 np0005591762 kernel: ganesha.nfsd[198809]: segfault at 50 ip 00007f28a9dce32e sp 00007f281e7fb210 error 4 in libntirpc.so.5.8[7f28a9db3000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 22 04:45:55 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:45:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[195316]: 22/01/2026 09:45:55 : epoch 6971f1c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55e28238c870 fd 38 proxy ignored for local
Jan 22 04:45:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094555 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:45:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:55 np0005591762 systemd[1]: Started Process Core Dump (PID 199486/UID 0).
Jan 22 04:45:55 np0005591762 python3.9[199485]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:45:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:55 np0005591762 python3.9[199641]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:45:56 np0005591762 systemd-coredump[199487]: Process 195322 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 47:#012#0  0x00007f28a9dce32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:45:56 np0005591762 systemd[1]: systemd-coredump@7-199486-0.service: Deactivated successfully.
Jan 22 04:45:56 np0005591762 systemd[1]: systemd-coredump@7-199486-0.service: Consumed 1.051s CPU time.
Jan 22 04:45:56 np0005591762 python3.9[199796]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:56 np0005591762 podman[199801]: 2026-01-22 09:45:56.404199678 +0000 UTC m=+0.023029559 container died bb66cd37179fccc78f2183aee4fa1a3327426aa6e023090d182f72162b129d29 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Jan 22 04:45:56 np0005591762 systemd[1]: var-lib-containers-storage-overlay-36926eb7c0967c0ecca10a03f7e34f88b294ad3e3efd5c94efe557b9ccbd8aee-merged.mount: Deactivated successfully.
Jan 22 04:45:56 np0005591762 podman[199801]: 2026-01-22 09:45:56.423008736 +0000 UTC m=+0.041838607 container remove bb66cd37179fccc78f2183aee4fa1a3327426aa6e023090d182f72162b129d29 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:45:56 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:45:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:45:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:56.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:45:56 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:45:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:45:56 np0005591762 python3.9[199984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:45:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:57.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:45:57 np0005591762 python3.9[200108]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075156.5408206-4146-38367807560841/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:57 np0005591762 python3.9[200260]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:57 np0005591762 podman[200261]: 2026-01-22 09:45:57.849002562 +0000 UTC m=+0.066380787 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 04:45:58 np0005591762 python3.9[200406]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075157.4271436-4191-179881764859821/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:45:58.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:58 np0005591762 python3.9[200559]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:45:59 np0005591762 python3.9[200682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075158.378766-4236-182825547500113/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:45:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:45:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:45:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:45:59.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:45:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:45:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:45:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:45:59 np0005591762 python3.9[200835]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:45:59 np0005591762 systemd[1]: Reloading.
Jan 22 04:45:59 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:45:59 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:00 np0005591762 systemd[1]: Reached target edpm_libvirt.target.
Jan 22 04:46:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:46:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:00.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:46:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:00 np0005591762 python3.9[201028]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 04:46:00 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:00 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:00 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:01 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:01 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:01 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094601 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:46:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:01.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:01 np0005591762 systemd[1]: session-51.scope: Deactivated successfully.
Jan 22 04:46:01 np0005591762 systemd[1]: session-51.scope: Consumed 2min 24.864s CPU time.
Jan 22 04:46:01 np0005591762 systemd-logind[744]: Session 51 logged out. Waiting for processes to exit.
Jan 22 04:46:01 np0005591762 systemd-logind[744]: Removed session 51.
Jan 22 04:46:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:02.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:03.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:04.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:05.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:06.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:06 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 8.
Jan 22 04:46:06 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:46:06 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:46:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:06 np0005591762 podman[201169]: 2026-01-22 09:46:06.901140553 +0000 UTC m=+0.024376429 container create 32be0d91595604d10891c5dd23870e4e9b4af24bc5029857cb075063908f7247 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:46:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6dcd18f0b2a1a827720e15e00c0dfca0e6d7fac9098c2c79db6bc35b2756e5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:46:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6dcd18f0b2a1a827720e15e00c0dfca0e6d7fac9098c2c79db6bc35b2756e5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:46:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6dcd18f0b2a1a827720e15e00c0dfca0e6d7fac9098c2c79db6bc35b2756e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:46:06 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6dcd18f0b2a1a827720e15e00c0dfca0e6d7fac9098c2c79db6bc35b2756e5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:46:06 np0005591762 podman[201169]: 2026-01-22 09:46:06.939659366 +0000 UTC m=+0.062895273 container init 32be0d91595604d10891c5dd23870e4e9b4af24bc5029857cb075063908f7247 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:46:06 np0005591762 podman[201169]: 2026-01-22 09:46:06.943792131 +0000 UTC m=+0.067028008 container start 32be0d91595604d10891c5dd23870e4e9b4af24bc5029857cb075063908f7247 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:46:06 np0005591762 bash[201169]: 32be0d91595604d10891c5dd23870e4e9b4af24bc5029857cb075063908f7247
Jan 22 04:46:06 np0005591762 podman[201169]: 2026-01-22 09:46:06.890823092 +0000 UTC m=+0.014058969 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:46:06 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:46:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:46:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:07 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:46:07 np0005591762 systemd-logind[744]: New session 52 of user zuul.
Jan 22 04:46:07 np0005591762 systemd[1]: Started Session 52 of User zuul.
Jan 22 04:46:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:07.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:08 np0005591762 python3.9[201376]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:46:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:08.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:09 np0005591762 python3.9[201531]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:46:09 np0005591762 network[201548]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:46:09 np0005591762 network[201549]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:46:09 np0005591762 network[201550]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:46:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:09.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:10.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:11.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:12.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:12 np0005591762 python3.9[201850]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 04:46:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:13 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:46:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:13 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:46:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:13.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:13 np0005591762 python3.9[201936]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:46:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:14.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:15.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:16.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:17.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:18 np0005591762 python3.9[202093]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:46:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:18.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094618 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:46:18 np0005591762 podman[202194]: 2026-01-22 09:46:18.819084761 +0000 UTC m=+0.042909516 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 04:46:18 np0005591762 python3.9[202261]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:19 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f54000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:19.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:19 np0005591762 python3.9[202429]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:20 np0005591762 python3.9[202581]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:46:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:20 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:20.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:20 np0005591762 python3.9[202734]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:46:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:20 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:20 np0005591762 python3.9[202858]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075180.192766-243-45793349157876/.source.iscsi _original_basename=.ok2hw267 follow=False checksum=496ad9d66788939bb63425550a56dd83595bac24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094621 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:46:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:21 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f58001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:21.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:21 np0005591762 python3.9[203011]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:22 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f500034a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:22 np0005591762 python3.9[203163]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:22.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:22 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:23 np0005591762 python3.9[203316]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:23 np0005591762 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 22 04:46:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:23 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:23.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:23 np0005591762 python3.9[203473]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:23 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:23 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:23 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:24 np0005591762 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 04:46:24 np0005591762 systemd[1]: Starting Open-iSCSI...
Jan 22 04:46:24 np0005591762 kernel: Loading iSCSI transport class v2.0-870.
Jan 22 04:46:24 np0005591762 systemd[1]: Started Open-iSCSI.
Jan 22 04:46:24 np0005591762 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 22 04:46:24 np0005591762 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 22 04:46:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:24 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:24.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:24 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f50003dc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:24 np0005591762 python3.9[203672]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:46:24 np0005591762 network[203689]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:46:24 np0005591762 network[203690]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:46:24 np0005591762 network[203691]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:46:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:25 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:25.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:26 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:26 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:46:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:26.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:26 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:27 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5000a030 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:27.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:27 np0005591762 python3.9[203966]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:46:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:28 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:28.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:28 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:28 np0005591762 podman[203970]: 2026-01-22 09:46:28.835869537 +0000 UTC m=+0.058972487 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:46:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:29 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:29.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:29 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:46:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:29 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:46:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:29 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:46:29 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:46:29 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:29 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:29 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:29 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:46:29 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:46:29 np0005591762 systemd[1]: run-r0a29c7d13de34484915c64b7b6814b28.service: Deactivated successfully.
Jan 22 04:46:29 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:46:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:30 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5000a450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:46:30 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:46:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:30.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:30 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:31 np0005591762 python3.9[204393]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 04:46:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:31 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:31.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:31 np0005591762 python3.9[204565]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 22 04:46:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:32 np0005591762 python3.9[204721]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:46:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:32 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:32 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:46:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:32.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:32 np0005591762 python3.9[204844]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075191.8165133-507-168761529479929/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:32 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:33 np0005591762 python3.9[204997]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:33 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:33 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:46:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:33.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:33 np0005591762 python3.9[205175]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:46:33 np0005591762 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 04:46:33 np0005591762 systemd[1]: Stopped Load Kernel Modules.
Jan 22 04:46:33 np0005591762 systemd[1]: Stopping Load Kernel Modules...
Jan 22 04:46:34 np0005591762 systemd[1]: Starting Load Kernel Modules...
Jan 22 04:46:34 np0005591762 systemd[1]: Finished Load Kernel Modules.
Jan 22 04:46:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:34 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:34 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:46:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:34.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:34 np0005591762 python3.9[205331]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:46:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:34 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:35 np0005591762 python3.9[205485]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:46:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:35 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:35.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:35 np0005591762 python3.9[205638]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:46:36 np0005591762 python3.9[205761]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075195.3823671-660-158956771874544/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:36 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:36.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:36 np0005591762 python3.9[205913]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:46:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:36 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:37 np0005591762 python3.9[206067]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:37 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:37.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:37 np0005591762 python3.9[206220]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:38 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:38 np0005591762 python3.9[206372]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:38.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:38 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:38 np0005591762 python3.9[206525]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094638 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:46:39 np0005591762 python3.9[206677]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:39 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:39.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:39 np0005591762 python3.9[206830]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:39 np0005591762 python3.9[206982]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
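[editor's note] The ansible.builtin.lineinfile and ansible.builtin.replace invocations logged above between 04:46:37 and 04:46:39 rewrite /etc/multipath.conf: an empty `blacklist { }` block is created, then `find_multipaths`, `recheck_wwid`, `skip_kpartx`, and `user_friendly_names` are each inserted after the line matching `^defaults` with `firstmatch=True`. Since every task inserts immediately after the `defaults` line, the later tasks land above the earlier ones. A sketch of the resulting file, assuming a stock `defaults {` section (the rest of the file's contents are not visible in the log and are an assumption):

```
defaults {
        user_friendly_names no
        skip_kpartx yes
        recheck_wwid yes
        find_multipaths yes
}
blacklist {
}
```

This matches the multipathd startup at 04:46:42 reporting `read /etc/multipath.conf` without errors.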
Jan 22 04:46:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:40 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:40 np0005591762 python3.9[207134]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:46:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:40.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:40 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:40 np0005591762 python3.9[207289]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:46:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:41 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:41.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:41 np0005591762 python3.9[207443]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:41 np0005591762 systemd[1]: Listening on multipathd control socket.
Jan 22 04:46:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:42 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f50009020 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:42 np0005591762 python3.9[207599]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:42 np0005591762 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 22 04:46:42 np0005591762 udevadm[207604]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 22 04:46:42 np0005591762 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 22 04:46:42 np0005591762 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 04:46:42 np0005591762 multipathd[207607]: --------start up--------
Jan 22 04:46:42 np0005591762 multipathd[207607]: read /etc/multipath.conf
Jan 22 04:46:42 np0005591762 multipathd[207607]: path checkers start up
Jan 22 04:46:42 np0005591762 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 04:46:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:42.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:42 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f50009020 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:43 np0005591762 python3.9[207767]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 04:46:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:43 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0050c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:43.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:43 np0005591762 python3.9[207920]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 22 04:46:43 np0005591762 kernel: Key type psk registered
Jan 22 04:46:44 np0005591762 python3.9[208082]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:46:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:44 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:44.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:44 np0005591762 python3.9[208205]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769075203.816391-1050-127678718756544/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:44 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f50009020 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:45 np0005591762 python3.9[208358]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
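[editor's note] The three tasks above persist the nvme-fabrics module: community.general.modprobe loads it immediately (the kernel registers `Key type psk` as a side effect), ansible.legacy.copy writes /etc/modules-load.d/nvme-fabrics.conf (its content is marked NOT_LOGGING_PARAMETER in the log), and lineinfile appends `nvme-fabrics` to /etc/modules. Given the systemd-modules-load restart that follows at 04:46:45, the dropped-in file plausibly contains just the module name; this is an assumption, since the copied content is not logged:

```
# /etc/modules-load.d/nvme-fabrics.conf (assumed content)
nvme-fabrics
```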
Jan 22 04:46:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:45 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:45.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:45 np0005591762 python3.9[208511]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:46:45 np0005591762 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 04:46:45 np0005591762 systemd[1]: Stopped Load Kernel Modules.
Jan 22 04:46:45 np0005591762 systemd[1]: Stopping Load Kernel Modules...
Jan 22 04:46:45 np0005591762 systemd[1]: Starting Load Kernel Modules...
Jan 22 04:46:45 np0005591762 systemd[1]: Finished Load Kernel Modules.
Jan 22 04:46:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:46 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:46 np0005591762 python3.9[208667]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 04:46:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:46.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:46 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:46:47.187 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:46:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:46:47.188 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:46:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:46:47.188 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:46:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:47 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f50009020 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:47.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:48 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:48 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:48 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:48 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:48 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:48 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:48 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:48.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:48 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:48 np0005591762 systemd-logind[744]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 04:46:48 np0005591762 systemd-logind[744]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 04:46:48 np0005591762 lvm[208783]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 04:46:48 np0005591762 lvm[208783]: VG ceph_vg0 finished
Jan 22 04:46:48 np0005591762 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 04:46:48 np0005591762 systemd[1]: Starting man-db-cache-update.service...
Jan 22 04:46:48 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:48 np0005591762 podman[208788]: 2026-01-22 09:46:48.93407488 +0000 UTC m=+0.089704073 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:46:49 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:49 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:49 np0005591762 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 04:46:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:49 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:49.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:49 np0005591762 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 04:46:49 np0005591762 systemd[1]: Finished man-db-cache-update.service.
Jan 22 04:46:49 np0005591762 systemd[1]: man-db-cache-update.service: Consumed 1.097s CPU time.
Jan 22 04:46:49 np0005591762 systemd[1]: run-rd4f2f11c1fc54ecb8efcd9eb02a167f1.service: Deactivated successfully.
Jan 22 04:46:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:50 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f50009020 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:50.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:50 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:50 np0005591762 python3.9[210171]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:46:51 np0005591762 systemd[1]: Stopping Open-iSCSI...
Jan 22 04:46:51 np0005591762 iscsid[203513]: iscsid shutting down.
Jan 22 04:46:51 np0005591762 systemd[1]: iscsid.service: Deactivated successfully.
Jan 22 04:46:51 np0005591762 systemd[1]: Stopped Open-iSCSI.
Jan 22 04:46:51 np0005591762 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 04:46:51 np0005591762 systemd[1]: Starting Open-iSCSI...
Jan 22 04:46:51 np0005591762 systemd[1]: Started Open-iSCSI.
Jan 22 04:46:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:51 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:51.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:51 np0005591762 python3.9[210353]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:46:51 np0005591762 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 22 04:46:51 np0005591762 multipathd[207607]: exit (signal)
Jan 22 04:46:51 np0005591762 multipathd[207607]: --------shut down-------
Jan 22 04:46:51 np0005591762 systemd[1]: multipathd.service: Deactivated successfully.
Jan 22 04:46:51 np0005591762 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 22 04:46:51 np0005591762 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 04:46:51 np0005591762 multipathd[210359]: --------start up--------
Jan 22 04:46:51 np0005591762 multipathd[210359]: read /etc/multipath.conf
Jan 22 04:46:51 np0005591762 multipathd[210359]: path checkers start up
Jan 22 04:46:51 np0005591762 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 04:46:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094652 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:46:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:52 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:52 np0005591762 python3.9[210516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 04:46:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:52.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:52 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f60002600 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094652 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:46:53 np0005591762 python3.9[210673]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:46:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:53 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:53.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:53 np0005591762 python3.9[210826]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:46:53 np0005591762 systemd[1]: Reloading.
Jan 22 04:46:54 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:46:54 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:46:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:54 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c001bc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:54.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:54 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:54 np0005591762 python3.9[211011]: ansible-ansible.builtin.service_facts Invoked
Jan 22 04:46:54 np0005591762 network[211028]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 04:46:54 np0005591762 network[211029]: 'network-scripts' will be removed from distribution in near future.
Jan 22 04:46:54 np0005591762 network[211030]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 04:46:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:55 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:55.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:56 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:56.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:56 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c001bc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:46:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:57 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:46:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:57.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:46:57 np0005591762 python3.9[211306]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:58 np0005591762 python3.9[211459]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:58 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f580045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:46:58.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:58 np0005591762 python3.9[211612]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:58 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:59 np0005591762 python3.9[211768]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:46:59 np0005591762 podman[211771]: 2026-01-22 09:46:59.174160776 +0000 UTC m=+0.062226775 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:46:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:46:59 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:46:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:46:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:46:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:46:59.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:46:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:46:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:46:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:46:59 np0005591762 python3.9[211946]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:47:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:00 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:00 np0005591762 python3.9[212099]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:47:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:00.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:00 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:47:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:00 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:00 np0005591762 python3.9[212252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:47:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:01 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f6c003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:01 np0005591762 python3.9[212406]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:47:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:01.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:02 np0005591762 python3.9[212560]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:02 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c002c40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:02 np0005591762 python3.9[212712]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:02.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:02 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f600035e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:02 np0005591762 python3.9[212865]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:03 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:03.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:03 np0005591762 python3.9[213018]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:03 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:47:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:03 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:47:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:03 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:47:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:03 np0005591762 python3.9[213170]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:04 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:04 np0005591762 python3.9[213322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:04.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:04 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:47:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:04 np0005591762 python3.9[213474]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:04 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:05 np0005591762 python3.9[213627]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:05 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f600035e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:05.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:05 np0005591762 python3.9[213780]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:05 np0005591762 python3.9[213932]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f6c004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:06 np0005591762 python3.9[214084]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:06.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:06 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c002c40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:06 np0005591762 python3.9[214237]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:07 np0005591762 python3.9[214389]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:07 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:07.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:07 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:47:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:07 np0005591762 python3.9[214542]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:08 np0005591762 python3.9[214694]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:08 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f600045b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:08 np0005591762 python3.9[214846]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:08.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:08 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f6c004c80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:09 np0005591762 python3.9[215000]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:09 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c003950 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:09.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:09 np0005591762 python3.9[215152]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 04:47:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:10 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:10.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:10 np0005591762 python3.9[215304]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:47:10 np0005591762 systemd[1]: Reloading.
Jan 22 04:47:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:10 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:47:10 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:47:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:10 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f600045b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:11 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f600045b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:11 np0005591762 python3.9[215518]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:11.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:11 np0005591762 python3.9[215671]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:12 np0005591762 python3.9[215824]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:12 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f5c003950 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:12.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:12 np0005591762 python3.9[215977]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:12 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f4c0069a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094712 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:47:13 np0005591762 python3.9[216131]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[201180]: 22/01/2026 09:47:13 : epoch 6971f1de : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9f6c004c80 fd 47 proxy ignored for local
Jan 22 04:47:13 np0005591762 kernel: ganesha.nfsd[211663]: segfault at 50 ip 00007f9fdfdad32e sp 00007f9f47ff6210 error 4 in libntirpc.so.5.8[7f9fdfd92000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 22 04:47:13 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:47:13 np0005591762 systemd[1]: Started Process Core Dump (PID 216267/UID 0).
Jan 22 04:47:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:13.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:13 np0005591762 python3.9[216287]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:13 np0005591762 python3.9[216440]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:47:14 np0005591762 systemd-coredump[216284]: Process 201184 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 57:#012#0  0x00007f9fdfdad32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f9fdfdb7900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 22 04:47:14 np0005591762 python3.9[216593]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 04:47:14 np0005591762 systemd[1]: systemd-coredump@8-216267-0.service: Deactivated successfully.
Jan 22 04:47:14 np0005591762 systemd[1]: systemd-coredump@8-216267-0.service: Consumed 1.016s CPU time.
Jan 22 04:47:14 np0005591762 podman[216600]: 2026-01-22 09:47:14.426456323 +0000 UTC m=+0.020875327 container died 32be0d91595604d10891c5dd23870e4e9b4af24bc5029857cb075063908f7247 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:47:14 np0005591762 systemd[1]: var-lib-containers-storage-overlay-ad6dcd18f0b2a1a827720e15e00c0dfca0e6d7fac9098c2c79db6bc35b2756e5-merged.mount: Deactivated successfully.
Jan 22 04:47:14 np0005591762 podman[216600]: 2026-01-22 09:47:14.443565194 +0000 UTC m=+0.037984198 container remove 32be0d91595604d10891c5dd23870e4e9b4af24bc5029857cb075063908f7247 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 22 04:47:14 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:47:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:14.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:14 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:47:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:15.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:15 np0005591762 python3.9[216784]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:16 np0005591762 python3.9[216936]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:16 np0005591762 python3.9[217088]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:16.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:16 np0005591762 python3.9[217241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:17.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:17 np0005591762 python3.9[217394]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:17 np0005591762 python3.9[217546]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:18 np0005591762 python3.9[217698]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:18.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:18 np0005591762 python3.9[217851]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:19 np0005591762 python3.9[218003]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094719 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:47:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:19.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:19 np0005591762 podman[218128]: 2026-01-22 09:47:19.495965955 +0000 UTC m=+0.068407061 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:47:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:19 np0005591762 python3.9[218173]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:20.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:21.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:47:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:22.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:47:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:23.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:24 np0005591762 python3.9[218329]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 22 04:47:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:47:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:24.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:47:24 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 9.
Jan 22 04:47:24 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:47:24 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:24 np0005591762 podman[218523]: 2026-01-22 09:47:24.717000645 +0000 UTC m=+0.027562088 container create 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:47:24 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5e36e4297e1ebc9482605461bf93afa51fd40a5a28ed4d1fc4f8ba22fc0adf0/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:24 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5e36e4297e1ebc9482605461bf93afa51fd40a5a28ed4d1fc4f8ba22fc0adf0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:24 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5e36e4297e1ebc9482605461bf93afa51fd40a5a28ed4d1fc4f8ba22fc0adf0/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:24 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5e36e4297e1ebc9482605461bf93afa51fd40a5a28ed4d1fc4f8ba22fc0adf0/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:24 np0005591762 podman[218523]: 2026-01-22 09:47:24.761018703 +0000 UTC m=+0.071580166 container init 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 22 04:47:24 np0005591762 podman[218523]: 2026-01-22 09:47:24.765333032 +0000 UTC m=+0.075894474 container start 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:47:24 np0005591762 bash[218523]: 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40
Jan 22 04:47:24 np0005591762 podman[218523]: 2026-01-22 09:47:24.705975991 +0000 UTC m=+0.016537444 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:47:24 np0005591762 python3.9[218498]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 04:47:24 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:47:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:47:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:25.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:25 np0005591762 python3.9[218735]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 04:47:25 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:47:25 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:47:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:26 np0005591762 systemd-logind[744]: New session 53 of user zuul.
Jan 22 04:47:26 np0005591762 systemd[1]: Started Session 53 of User zuul.
Jan 22 04:47:26 np0005591762 systemd[1]: session-53.scope: Deactivated successfully.
Jan 22 04:47:26 np0005591762 systemd-logind[744]: Session 53 logged out. Waiting for processes to exit.
Jan 22 04:47:26 np0005591762 systemd-logind[744]: Removed session 53.
Jan 22 04:47:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:26.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:27 np0005591762 python3.9[218923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:27.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:27 np0005591762 python3.9[219045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075246.7337904-2657-178610569903822/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:27 np0005591762 python3.9[219195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:28 np0005591762 python3.9[219271]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:28.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:28 np0005591762 python3.9[219421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:28 np0005591762 python3.9[219543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075248.2650967-2657-126056904713501/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:29 np0005591762 podman[219668]: 2026-01-22 09:47:29.324567284 +0000 UTC m=+0.057492954 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 04:47:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:29.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:29 np0005591762 python3.9[219704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:29 np0005591762 python3.9[219838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075249.081287-2657-258546711546079/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:30 np0005591762 python3.9[219988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:30.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:30 np0005591762 python3.9[220110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075250.049547-2657-188986927148647/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:30 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:47:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:30 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:47:31 np0005591762 python3.9[220284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:31.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:31 np0005591762 python3.9[220407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075250.8830616-2657-78712876095812/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:32 np0005591762 python3.9[220559]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:32.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:32 np0005591762 python3.9[220712]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:33.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:33 np0005591762 python3.9[220865]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:47:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:33 np0005591762 python3.9[221090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:33 np0005591762 podman[221126]: 2026-01-22 09:47:33.915599484 +0000 UTC m=+0.038341832 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 22 04:47:34 np0005591762 podman[221190]: 2026-01-22 09:47:34.049486473 +0000 UTC m=+0.048108574 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 22 04:47:34 np0005591762 podman[221126]: 2026-01-22 09:47:34.052247181 +0000 UTC m=+0.174989520 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 22 04:47:34 np0005591762 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 22 04:47:34 np0005591762 python3.9[221296]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769075253.5611322-2979-24053392439049/.source _original_basename=.55hl1heg follow=False checksum=4c68cadc70785cdae4128fdb027197ff8c3652b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 22 04:47:34 np0005591762 podman[221365]: 2026-01-22 09:47:34.406041663 +0000 UTC m=+0.034933795 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:47:34 np0005591762 podman[221365]: 2026-01-22 09:47:34.414595688 +0000 UTC m=+0.043487799 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:47:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:34.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:34 np0005591762 podman[221458]: 2026-01-22 09:47:34.606584169 +0000 UTC m=+0.033225202 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:47:34 np0005591762 podman[221458]: 2026-01-22 09:47:34.616669052 +0000 UTC m=+0.043310094 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:47:34 np0005591762 podman[221547]: 2026-01-22 09:47:34.758324604 +0000 UTC m=+0.035787565 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64)
Jan 22 04:47:34 np0005591762 podman[221547]: 2026-01-22 09:47:34.76555376 +0000 UTC m=+0.043016720 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, architecture=x86_64, version=2.2.4, name=keepalived, distribution-scope=public, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 04:47:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094734 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:47:34 np0005591762 podman[221649]: 2026-01-22 09:47:34.875839893 +0000 UTC m=+0.034844946 container exec 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:47:34 np0005591762 podman[221649]: 2026-01-22 09:47:34.885607136 +0000 UTC m=+0.044612190 container exec_died 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Jan 22 04:47:35 np0005591762 python3.9[221692]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:47:35 np0005591762 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 04:47:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:47:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:35.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:47:35 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:35 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:35 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:35 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:35 np0005591762 python3.9[221947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:35 np0005591762 python3.9[222084]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075255.1994848-3056-159147414280381/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:36 np0005591762 python3.9[222234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 04:47:36 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:47:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:36 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:47:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:36.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:36 np0005591762 python3.9[222356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769075256.0860584-3101-101333260164963/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:47:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:47:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:47:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:37 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a9c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:37.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:37 np0005591762 ceph-mon[75519]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 22 04:47:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:37 np0005591762 python3.9[222523]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 22 04:47:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:38 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2bb80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:38 np0005591762 python3.9[222675]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 04:47:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:38.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:38 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094739 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:47:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:39 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:39.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:39 np0005591762 python3[222854]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 04:47:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:40 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:40.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:40 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2bb80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:41 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2bb80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:41.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:42 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2bb80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:42 np0005591762 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 22 04:47:42 np0005591762 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 22 04:47:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:42.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:42 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:47:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:42 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:43 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90002de0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:43.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:44 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90002de0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:47:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:44.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:47:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:44 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:45 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:47:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:45.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:47:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:45 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:47:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:45 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:47:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:46 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:46.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:46 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:47:47.189 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:47:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:47:47.189 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:47:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:47:47.189 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:47:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:47 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:47.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:48 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:48.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:48 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:47:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:48 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:49 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:49.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:50 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:50 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:47:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:50.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:50 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a940039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:51 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900055c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:51.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:51 np0005591762 podman[222927]: 2026-01-22 09:47:51.511591797 +0000 UTC m=+1.734751183 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 04:47:51 np0005591762 podman[222865]: 2026-01-22 09:47:51.541462559 +0000 UTC m=+11.901600069 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 22 04:47:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:51 np0005591762 podman[222987]: 2026-01-22 09:47:51.640154233 +0000 UTC m=+0.029647723 container create 8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 22 04:47:51 np0005591762 podman[222987]: 2026-01-22 09:47:51.624653986 +0000 UTC m=+0.014147486 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 22 04:47:51 np0005591762 python3[222854]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 22 04:47:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:52 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900055c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:52.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:52 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:53 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a940039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:53.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:53 np0005591762 python3.9[223169]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:47:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:54 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900055c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:54 np0005591762 python3.9[223323]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 22 04:47:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:54.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:54 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900055c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094754 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:47:54 np0005591762 python3.9[223476]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 04:47:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:55 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:55.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:55 np0005591762 python3[223629]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 04:47:55 np0005591762 podman[223658]: 2026-01-22 09:47:55.983093477 +0000 UTC m=+0.029192474 container create 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:47:55 np0005591762 podman[223658]: 2026-01-22 09:47:55.969698041 +0000 UTC m=+0.015797048 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 22 04:47:55 np0005591762 python3[223629]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume 
/etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 22 04:47:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:56 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a940039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:56 np0005591762 python3.9[223836]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:47:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:56.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:56 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a940039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:47:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:57 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:57.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:57 np0005591762 python3.9[223992]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:58 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:58 np0005591762 python3.9[224143]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769075277.9555721-3389-38838728473571/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 04:47:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:47:58.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:58 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:58 np0005591762 python3.9[224220]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 04:47:58 np0005591762 systemd[1]: Reloading.
Jan 22 04:47:58 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:47:58 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:47:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:47:59 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a940039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:47:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:47:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:47:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:47:59.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:47:59 np0005591762 python3.9[224332]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 04:47:59 np0005591762 systemd[1]: Reloading.
Jan 22 04:47:59 np0005591762 podman[224334]: 2026-01-22 09:47:59.60397715 +0000 UTC m=+0.070110314 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:47:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:47:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:47:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:47:59 np0005591762 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 04:47:59 np0005591762 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 04:47:59 np0005591762 systemd[1]: Starting nova_compute container...
Jan 22 04:47:59 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:47:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:59 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 04:47:59 np0005591762 podman[224395]: 2026-01-22 09:47:59.899073244 +0000 UTC m=+0.062353743 container init 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Jan 22 04:47:59 np0005591762 podman[224395]: 2026-01-22 09:47:59.9062493 +0000 UTC m=+0.069529799 container start 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2)
Jan 22 04:47:59 np0005591762 podman[224395]: nova_compute
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + sudo -E kolla_set_configs
Jan 22 04:47:59 np0005591762 systemd[1]: Started nova_compute container.
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Validating config file
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying service configuration files
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Deleting /etc/ceph
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Creating directory /etc/ceph
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Writing out command to execute
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:47:59 np0005591762 nova_compute[224407]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 04:47:59 np0005591762 nova_compute[224407]: ++ cat /run_command
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + CMD=nova-compute
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + ARGS=
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + sudo kolla_copy_cacerts
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + [[ ! -n '' ]]
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + . kolla_extend_start
Jan 22 04:47:59 np0005591762 nova_compute[224407]: Running command: 'nova-compute'
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + umask 0022
Jan 22 04:47:59 np0005591762 nova_compute[224407]: + exec nova-compute
Jan 22 04:48:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:00 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:00.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:00 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:00 np0005591762 python3.9[224569]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:48:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:01 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:48:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:01.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:48:01 np0005591762 python3.9[224721]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:48:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.777 224411 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.778 224411 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.778 224411 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.778 224411 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 22 04:48:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.897 224411 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.906 224411 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:48:01 np0005591762 nova_compute[224407]: 2026-01-22 09:48:01.906 224411 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 22 04:48:01 np0005591762 python3.9[224873]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 04:48:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:02 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94004ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.485 224411 INFO nova.virt.driver [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 22 04:48:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:02.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.570 224411 INFO nova.compute.provider_config [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.583 224411 DEBUG oslo_concurrency.lockutils [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.584 224411 DEBUG oslo_concurrency.lockutils [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.584 224411 DEBUG oslo_concurrency.lockutils [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.585 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.586 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.587 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.588 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.589 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.590 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.590 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.590 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.590 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.590 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.590 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.591 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.592 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.593 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.594 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.595 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.595 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.595 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.595 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.595 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.595 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.596 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.597 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.598 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.599 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.600 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.601 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.602 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.603 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.604 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.605 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.606 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.607 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.608 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.609 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.610 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.610 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.610 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.610 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.610 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.610 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.611 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.612 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.613 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.614 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.615 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.616 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.617 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.618 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.619 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.620 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.621 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.622 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.622 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.622 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.622 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.622 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.622 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.623 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.624 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.625 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.626 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.627 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.628 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.629 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.630 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.631 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.632 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.633 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.633 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.633 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.633 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.633 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.633 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.634 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.635 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.635 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.635 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.635 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.635 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.635 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.636 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.637 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.638 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.639 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.640 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.640 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.640 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.640 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.640 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.640 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.641 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.642 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.643 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.644 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.645 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.646 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.647 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.648 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.648 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.648 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.648 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.648 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.648 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.649 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.649 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.649 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.649 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.649 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.649 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.650 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.651 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.652 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.653 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.654 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.655 224411 WARNING oslo_config.cfg [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 04:48:02 np0005591762 nova_compute[224407]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 04:48:02 np0005591762 nova_compute[224407]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 04:48:02 np0005591762 nova_compute[224407]: and ``live_migration_inbound_addr`` respectively.
Jan 22 04:48:02 np0005591762 nova_compute[224407]: ).  Its value may be silently ignored in the future.#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.655 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.655 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.655 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.655 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.656 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.657 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.657 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.657 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.657 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.657 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.657 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rbd_secret_uuid        = 43df7a30-cf5f-5209-adfd-bf44298b19f2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.658 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.659 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.659 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.659 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.659 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.659 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.659 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.660 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.661 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.662 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.662 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.662 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.662 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.662 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.662 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.663 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.664 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.665 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.666 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.667 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.668 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.669 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.670 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.671 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.672 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.673 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.674 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.675 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.675 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.675 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.675 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.675 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.675 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.676 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.677 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.677 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.677 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.677 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.677 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.677 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.678 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.679 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.680 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.681 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.681 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.681 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.681 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.681 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.681 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.682 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.683 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.683 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.683 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.683 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.683 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.683 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.684 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.685 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.686 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.687 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.688 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.689 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.690 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.691 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.691 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.691 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.691 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.691 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.691 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.692 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.693 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.694 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.695 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.695 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.695 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.695 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.695 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.695 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.696 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.697 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.698 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.699 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.699 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.699 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.699 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.699 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.700 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.701 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.702 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.703 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.704 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.705 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.706 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.707 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.708 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.709 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.710 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.711 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.712 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.713 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.714 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.715 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.716 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.716 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.716 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.716 224411 DEBUG oslo_service.service [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.717 224411 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 22 04:48:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:02 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.728 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.728 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.728 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.729 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 22 04:48:02 np0005591762 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 04:48:02 np0005591762 systemd[1]: Started libvirt QEMU daemon.
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.778 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f19905fda90> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.781 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f19905fda90> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.781 224411 INFO nova.virt.libvirt.driver [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.800 224411 WARNING nova.virt.libvirt.driver [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 22 04:48:02 np0005591762 nova_compute[224407]: 2026-01-22 09:48:02.801 224411 DEBUG nova.virt.libvirt.volume.mount [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 22 04:48:02 np0005591762 python3.9[225028]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 04:48:02 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:48:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:03 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:03.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.489 224411 INFO nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <host>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <uuid>5cdfdaef-d5ed-40c6-865a-abf2be70f95e</uuid>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <arch>x86_64</arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model>EPYC-Milan-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <vendor>AMD</vendor>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <microcode version='167776725'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <signature family='25' model='1' stepping='1'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <maxphysaddr mode='emulate' bits='48'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='x2apic'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='tsc-deadline'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='osxsave'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='hypervisor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='tsc_adjust'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='ospke'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='vaes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='vpclmulqdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='spec-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='stibp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='arch-capabilities'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='cmp_legacy'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='virt-ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='lbrv'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='tsc-scale'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='vmcb-clean'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='pause-filter'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='pfthreshold'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='v-vmsave-vmload'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='vgif'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='rdctl-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='skip-l1dfl-vmentry'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='mds-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature name='pschange-mc-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <pages unit='KiB' size='4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <pages unit='KiB' size='2048'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <pages unit='KiB' size='1048576'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <power_management>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <suspend_mem/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </power_management>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <iommu support='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <migration_features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <live/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <uri_transports>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <uri_transport>tcp</uri_transport>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <uri_transport>rdma</uri_transport>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </uri_transports>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </migration_features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <topology>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <cells num='1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <cell id='0'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          <memory unit='KiB'>7865364</memory>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          <pages unit='KiB' size='4'>1966341</pages>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          <pages unit='KiB' size='2048'>0</pages>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          <distances>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:            <sibling id='0' value='10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          </distances>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          <cpus num='4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:          </cpus>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        </cell>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </cells>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </topology>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <cache>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </cache>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <secmodel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model>selinux</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <doi>0</doi>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </secmodel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <secmodel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model>dac</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <doi>0</doi>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </secmodel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </host>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <guest>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <os_type>hvm</os_type>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <arch name='i686'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <wordsize>32</wordsize>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <domain type='qemu'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <domain type='kvm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <pae/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <nonpae/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <acpi default='on' toggle='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <apic default='on' toggle='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <cpuselection/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <deviceboot/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <disksnapshot default='on' toggle='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <externalSnapshot/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </guest>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <guest>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <os_type>hvm</os_type>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <arch name='x86_64'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <wordsize>64</wordsize>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <domain type='qemu'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <domain type='kvm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <acpi default='on' toggle='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <apic default='on' toggle='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <cpuselection/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <deviceboot/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <disksnapshot default='on' toggle='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <externalSnapshot/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </guest>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 
Jan 22 04:48:03 np0005591762 nova_compute[224407]: </capabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: #033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.494 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.512 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 04:48:03 np0005591762 nova_compute[224407]: <domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <domain>kvm</domain>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <arch>i686</arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <vcpu max='240'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <iothreads supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <os supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='firmware'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <loader supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>rom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pflash</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='readonly'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>yes</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='secure'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </loader>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </os>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='maximumMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <vendor>AMD</vendor>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='succor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='custom' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 python3.9[225259]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <memoryBacking supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='sourceType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>anonymous</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>memfd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </memoryBacking>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <disk supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='diskDevice'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>disk</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cdrom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>floppy</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>lun</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ide</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>fdc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>sata</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </disk>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <graphics supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vnc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egl-headless</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </graphics>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <video supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='modelType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vga</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cirrus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>none</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>bochs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ramfb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </video>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hostdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='mode'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>subsystem</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='startupPolicy'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>mandatory</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>requisite</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>optional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='subsysType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pci</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='capsType'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='pciBackend'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hostdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <rng supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>random</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </rng>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <filesystem supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='driverType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>path</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>handle</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtiofs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </filesystem>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tpm supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-tis</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-crb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emulator</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>external</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendVersion'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>2.0</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </tpm>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <redirdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </redirdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <channel supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </channel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <crypto supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </crypto>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <interface supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>passt</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </interface>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <panic supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>isa</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>hyperv</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </panic>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <console supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>null</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dev</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pipe</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stdio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>udp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tcp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu-vdagent</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </console>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <gic supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <genid supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backup supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <async-teardown supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <s390-pv supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <ps2 supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tdx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sev supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sgx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hyperv supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='features'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>relaxed</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vapic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>spinlocks</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vpindex</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>runtime</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>synic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stimer</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reset</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vendor_id</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>frequencies</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reenlightenment</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tlbflush</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ipi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>avic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emsr_bitmap</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>xmm_input</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hyperv>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <launchSecurity supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: </domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.516 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 04:48:03 np0005591762 nova_compute[224407]: <domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <domain>kvm</domain>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <arch>i686</arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <vcpu max='4096'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <iothreads supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <os supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='firmware'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <loader supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>rom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pflash</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='readonly'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>yes</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='secure'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </loader>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </os>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='maximumMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <vendor>AMD</vendor>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='succor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='custom' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 systemd[1]: Stopping nova_compute container...
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <memoryBacking supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='sourceType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>anonymous</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>memfd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </memoryBacking>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <disk supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='diskDevice'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>disk</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cdrom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>floppy</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>lun</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>fdc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>sata</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </disk>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <graphics supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vnc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egl-headless</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </graphics>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <video supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='modelType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vga</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cirrus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>none</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>bochs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ramfb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </video>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hostdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='mode'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>subsystem</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='startupPolicy'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>mandatory</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>requisite</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>optional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='subsysType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pci</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='capsType'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='pciBackend'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hostdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <rng supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>random</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </rng>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <filesystem supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='driverType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>path</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>handle</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtiofs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </filesystem>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tpm supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-tis</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-crb</value>
Jan 22 04:48:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emulator</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>external</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendVersion'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>2.0</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </tpm>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <redirdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </redirdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <channel supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </channel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <crypto supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </crypto>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <interface supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>passt</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </interface>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <panic supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>isa</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>hyperv</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </panic>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <console supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>null</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dev</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pipe</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stdio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>udp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tcp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu-vdagent</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </console>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <gic supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <genid supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backup supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <async-teardown supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <s390-pv supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <ps2 supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tdx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sev supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sgx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hyperv supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='features'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>relaxed</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vapic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>spinlocks</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vpindex</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>runtime</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>synic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stimer</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reset</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vendor_id</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>frequencies</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reenlightenment</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tlbflush</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ipi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>avic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emsr_bitmap</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>xmm_input</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hyperv>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <launchSecurity supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: </domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.547 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.550 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 04:48:03 np0005591762 nova_compute[224407]: <domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <domain>kvm</domain>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <arch>x86_64</arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <vcpu max='240'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <iothreads supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <os supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='firmware'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <loader supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>rom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pflash</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='readonly'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>yes</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='secure'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </loader>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </os>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='maximumMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <vendor>AMD</vendor>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='succor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='custom' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <memoryBacking supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='sourceType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>anonymous</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>memfd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </memoryBacking>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <disk supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='diskDevice'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>disk</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cdrom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>floppy</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>lun</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ide</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>fdc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>sata</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </disk>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <graphics supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vnc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egl-headless</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </graphics>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <video supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='modelType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vga</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cirrus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>none</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>bochs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ramfb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </video>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hostdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='mode'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>subsystem</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='startupPolicy'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>mandatory</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>requisite</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>optional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='subsysType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pci</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='capsType'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='pciBackend'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hostdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <rng supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>random</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </rng>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <filesystem supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='driverType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>path</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>handle</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtiofs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </filesystem>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tpm supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-tis</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-crb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emulator</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>external</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendVersion'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>2.0</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </tpm>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <redirdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </redirdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <channel supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </channel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <crypto supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </crypto>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <interface supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>passt</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </interface>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <panic supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>isa</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>hyperv</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </panic>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <console supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>null</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dev</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pipe</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stdio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>udp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tcp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu-vdagent</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </console>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <gic supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <genid supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backup supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <async-teardown supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <s390-pv supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <ps2 supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tdx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sev supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sgx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hyperv supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='features'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>relaxed</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vapic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>spinlocks</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vpindex</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>runtime</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>synic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stimer</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reset</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vendor_id</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>frequencies</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reenlightenment</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tlbflush</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ipi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>avic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emsr_bitmap</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>xmm_input</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hyperv>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <launchSecurity supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: </domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.602 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 04:48:03 np0005591762 nova_compute[224407]: <domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <domain>kvm</domain>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <arch>x86_64</arch>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <vcpu max='4096'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <iothreads supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <os supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='firmware'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>efi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <loader supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>rom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pflash</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='readonly'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>yes</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='secure'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>yes</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>no</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </loader>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </os>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='maximumMigratable'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>on</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>off</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <vendor>AMD</vendor>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='succor'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <mode name='custom' supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ddpd-u'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sha512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm3'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sm4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Denverton-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amd-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='auto-ibrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='perfmon-v2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbpb'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='stibp-always-on'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-128'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-256'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx10-512'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='prefetchiti'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Haswell-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512er'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512pf'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fma4'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tbm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xop'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='amx-tile'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-bf16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-fp16'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bitalg'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrc'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fzrm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='la57'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='taa-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='xfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ifma'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cmpccxadd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fbsdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='fsrs'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ibrs-all'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='intel-psfd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='lam'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mcdt-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='pbrsb-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='psdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='serialize'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='hle'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='rtm'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512bw'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512cd'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512dq'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512f'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='avx512vl'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='mpx'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='core-capability'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='split-lock-detect'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='cldemote'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='gfni'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdir64b'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='movdiri'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='athlon-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='core2duo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='coreduo-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='n270-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='ss'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <blockers model='phenom-v1'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnow'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <feature name='3dnowext'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </blockers>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </mode>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </cpu>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <memoryBacking supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <enum name='sourceType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>anonymous</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <value>memfd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </memoryBacking>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <disk supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='diskDevice'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>disk</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cdrom</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>floppy</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>lun</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>fdc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>sata</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </disk>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <graphics supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vnc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egl-headless</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </graphics>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <video supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='modelType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vga</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>cirrus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>none</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>bochs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ramfb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </video>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hostdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='mode'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>subsystem</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='startupPolicy'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>mandatory</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>requisite</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>optional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='subsysType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pci</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>scsi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='capsType'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='pciBackend'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hostdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <rng supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtio-non-transitional</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>random</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>egd</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </rng>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <filesystem supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='driverType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>path</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>handle</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>virtiofs</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </filesystem>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tpm supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-tis</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tpm-crb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emulator</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>external</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendVersion'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>2.0</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </tpm>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <redirdev supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='bus'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>usb</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </redirdev>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <channel supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </channel>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <crypto supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendModel'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>builtin</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </crypto>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <interface supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='backendType'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>default</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>passt</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </interface>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <panic supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='model'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>isa</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>hyperv</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </panic>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <console supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='type'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>null</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vc</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pty</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dev</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>file</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>pipe</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stdio</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>udp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tcp</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>unix</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>qemu-vdagent</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>dbus</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </console>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </devices>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  <features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <gic supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <genid supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <backup supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <async-teardown supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <s390-pv supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <ps2 supported='yes'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <tdx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sev supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <sgx supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <hyperv supported='yes'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <enum name='features'>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>relaxed</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vapic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>spinlocks</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vpindex</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>runtime</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>synic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>stimer</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reset</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>vendor_id</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>frequencies</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>reenlightenment</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>tlbflush</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>ipi</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>avic</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>emsr_bitmap</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <value>xmm_input</value>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </enum>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      <defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:      </defaults>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    </hyperv>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:    <launchSecurity supported='no'/>
Jan 22 04:48:03 np0005591762 nova_compute[224407]:  </features>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: </domainCapabilities>
Jan 22 04:48:03 np0005591762 nova_compute[224407]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.658 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.658 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.658 224411 DEBUG nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.658 224411 INFO nova.virt.libvirt.host [None req-64251ec7-d154-4897-b6c2-c9048b177b8b - - - - - -] Secure Boot support detected#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.659 224411 DEBUG oslo_concurrency.lockutils [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.659 224411 DEBUG oslo_concurrency.lockutils [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:48:03 np0005591762 nova_compute[224407]: 2026-01-22 09:48:03.659 224411 DEBUG oslo_concurrency.lockutils [None req-2889367f-15ce-414a-84de-094553dbfa54 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:48:03 np0005591762 virtqemud[225050]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 04:48:03 np0005591762 virtqemud[225050]: hostname: compute-2
Jan 22 04:48:03 np0005591762 virtqemud[225050]: End of file while reading data: Input/output error
Jan 22 04:48:03 np0005591762 systemd[1]: libpod-0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51.scope: Deactivated successfully.
Jan 22 04:48:03 np0005591762 systemd[1]: libpod-0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51.scope: Consumed 2.397s CPU time.
Jan 22 04:48:03 np0005591762 podman[225267]: 2026-01-22 09:48:03.88685737 +0000 UTC m=+0.310243797 container died 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 22 04:48:03 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51-userdata-shm.mount: Deactivated successfully.
Jan 22 04:48:03 np0005591762 systemd[1]: var-lib-containers-storage-overlay-57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4-merged.mount: Deactivated successfully.
Jan 22 04:48:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:04 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:04.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:04 np0005591762 podman[225267]: 2026-01-22 09:48:04.704491883 +0000 UTC m=+1.127878310 container cleanup 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:48:04 np0005591762 podman[225267]: nova_compute
Jan 22 04:48:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:04 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94004ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:04 np0005591762 podman[225291]: nova_compute
Jan 22 04:48:04 np0005591762 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 22 04:48:04 np0005591762 systemd[1]: Stopped nova_compute container.
Jan 22 04:48:04 np0005591762 systemd[1]: Starting nova_compute container...
Jan 22 04:48:04 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:48:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:04 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57ece8a907f8daa8d4a870a1cf259ac1f77b4594d7fac0b1109a4ae593bafff4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:04 np0005591762 podman[225300]: 2026-01-22 09:48:04.859941294 +0000 UTC m=+0.081270895 container init 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Jan 22 04:48:04 np0005591762 podman[225300]: 2026-01-22 09:48:04.865242683 +0000 UTC m=+0.086572285 container start 0df0cea4cda32d17323364e466d6f0defc53ebdbf99b4e0ccba1e8d300d26c51 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute)
Jan 22 04:48:04 np0005591762 podman[225300]: nova_compute
Jan 22 04:48:04 np0005591762 systemd[1]: Started nova_compute container.
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + sudo -E kolla_set_configs
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Validating config file
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying service configuration files
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /etc/ceph
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Creating directory /etc/ceph
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Writing out command to execute
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:48:04 np0005591762 nova_compute[225313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 04:48:04 np0005591762 nova_compute[225313]: ++ cat /run_command
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + CMD=nova-compute
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + ARGS=
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + sudo kolla_copy_cacerts
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + [[ ! -n '' ]]
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + . kolla_extend_start
Jan 22 04:48:04 np0005591762 nova_compute[225313]: Running command: 'nova-compute'
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + umask 0022
Jan 22 04:48:04 np0005591762 nova_compute[225313]: + exec nova-compute
Jan 22 04:48:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:05 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:05.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:05 np0005591762 python3.9[225477]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 04:48:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:05 np0005591762 systemd[1]: Started libpod-conmon-8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f.scope.
Jan 22 04:48:05 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:48:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9123231b41c32d6da4ce1b60861e60a1da3eff04c23479eab7a303b42e52e666/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9123231b41c32d6da4ce1b60861e60a1da3eff04c23479eab7a303b42e52e666/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:05 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9123231b41c32d6da4ce1b60861e60a1da3eff04c23479eab7a303b42e52e666/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 22 04:48:05 np0005591762 podman[225498]: 2026-01-22 09:48:05.684778213 +0000 UTC m=+0.085108393 container init 8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 04:48:05 np0005591762 podman[225498]: 2026-01-22 09:48:05.690195652 +0000 UTC m=+0.090525811 container start 8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:48:05 np0005591762 python3.9[225477]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Applying nova statedir ownership
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 22 04:48:05 np0005591762 nova_compute_init[225516]: INFO:nova_statedir:Nova statedir ownership complete
Jan 22 04:48:05 np0005591762 systemd[1]: libpod-8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f.scope: Deactivated successfully.
Jan 22 04:48:05 np0005591762 podman[225527]: 2026-01-22 09:48:05.769630102 +0000 UTC m=+0.021376822 container died 8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:48:05 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f-userdata-shm.mount: Deactivated successfully.
Jan 22 04:48:05 np0005591762 systemd[1]: var-lib-containers-storage-overlay-9123231b41c32d6da4ce1b60861e60a1da3eff04c23479eab7a303b42e52e666-merged.mount: Deactivated successfully.
Jan 22 04:48:05 np0005591762 podman[225527]: 2026-01-22 09:48:05.814672421 +0000 UTC m=+0.066419131 container cleanup 8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251202)
Jan 22 04:48:05 np0005591762 systemd[1]: libpod-conmon-8d4cdf1614315274d2f855f5ba6d19c4c1366b5b229fb28c7e64c93a031ae30f.scope: Deactivated successfully.
Jan 22 04:48:06 np0005591762 systemd-logind[744]: Session 52 logged out. Waiting for processes to exit.
Jan 22 04:48:06 np0005591762 systemd[1]: session-52.scope: Deactivated successfully.
Jan 22 04:48:06 np0005591762 systemd[1]: session-52.scope: Consumed 1min 27.904s CPU time.
Jan 22 04:48:06 np0005591762 systemd-logind[744]: Removed session 52.
Jan 22 04:48:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:06 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:06.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.592 225317 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.593 225317 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.593 225317 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.593 225317 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 22 04:48:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.704 225317 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.714 225317 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:48:06 np0005591762 nova_compute[225313]: 2026-01-22 09:48:06.714 225317 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 22 04:48:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:06 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.088 225317 INFO nova.virt.driver [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.168 225317 INFO nova.compute.provider_config [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.177 225317 DEBUG oslo_concurrency.lockutils [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.177 225317 DEBUG oslo_concurrency.lockutils [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.177 225317 DEBUG oslo_concurrency.lockutils [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.177 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.178 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.179 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.180 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.181 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.182 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.183 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.183 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.183 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.183 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.183 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.183 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.184 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.185 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.186 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.187 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.188 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.189 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.190 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.191 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.192 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.193 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.194 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.195 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.196 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.197 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.198 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.199 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.200 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.201 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.202 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.203 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.204 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.205 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.206 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.207 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.208 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.209 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.210 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.211 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.212 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.213 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.214 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.215 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.216 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.217 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.218 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.219 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.220 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.221 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.222 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.223 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.224 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.225 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.226 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.227 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.227 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.227 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.227 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.227 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.227 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.228 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.229 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.230 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.231 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.231 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.231 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.231 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.231 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.231 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.232 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.233 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.234 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.235 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.236 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.237 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.238 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.239 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.240 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.241 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.242 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.243 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.244 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.245 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.245 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.245 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.245 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.245 225317 WARNING oslo_config.cfg [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 04:48:07 np0005591762 nova_compute[225313]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 04:48:07 np0005591762 nova_compute[225313]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 04:48:07 np0005591762 nova_compute[225313]: and ``live_migration_inbound_addr`` respectively.
Jan 22 04:48:07 np0005591762 nova_compute[225313]: ).  Its value may be silently ignored in the future.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.245 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.246 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.247 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rbd_secret_uuid        = 43df7a30-cf5f-5209-adfd-bf44298b19f2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.248 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.249 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.250 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.251 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.251 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.251 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.251 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.251 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.251 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.252 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.253 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.254 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.255 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.255 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.255 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.255 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.255 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.255 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.256 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.257 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.258 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.258 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.258 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.258 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.258 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.258 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.259 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.260 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.261 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.262 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.263 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.264 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.265 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.266 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.267 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.268 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.268 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.268 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.268 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.268 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.268 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.269 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.270 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.271 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.272 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.272 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.272 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.272 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.272 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.273 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.273 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.273 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.273 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.273 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.273 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.274 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.274 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.274 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.274 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.274 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.274 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.275 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.276 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.277 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.278 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.279 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.280 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.281 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.281 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.281 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.281 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.281 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.281 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.282 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.282 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.283 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.283 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.283 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.283 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.284 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.285 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.286 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.287 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.288 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.289 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.290 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.291 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.292 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.293 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.294 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.295 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.295 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.295 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.295 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.295 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.295 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.296 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.297 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.298 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.299 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.300 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.301 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.302 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.303 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.304 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.305 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.306 225317 DEBUG oslo_service.service [None req-7cda5ba1-8720-444f-bd9d-66c05bb41717 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 04:48:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:07 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a94004ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.307 225317 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.317 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.318 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.318 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.318 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.330 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4d7f766490> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.332 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4d7f766490> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.333 225317 INFO nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.336 225317 INFO nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <host>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <uuid>5cdfdaef-d5ed-40c6-865a-abf2be70f95e</uuid>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <arch>x86_64</arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model>EPYC-Milan-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <vendor>AMD</vendor>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <microcode version='167776725'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <signature family='25' model='1' stepping='1'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <maxphysaddr mode='emulate' bits='48'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='x2apic'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='tsc-deadline'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='osxsave'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='hypervisor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='tsc_adjust'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='ospke'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='vaes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='vpclmulqdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='spec-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='stibp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='arch-capabilities'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='cmp_legacy'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='virt-ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='lbrv'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='tsc-scale'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='vmcb-clean'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='pause-filter'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='pfthreshold'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='v-vmsave-vmload'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='vgif'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='rdctl-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='skip-l1dfl-vmentry'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='mds-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature name='pschange-mc-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <pages unit='KiB' size='4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <pages unit='KiB' size='2048'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <pages unit='KiB' size='1048576'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <power_management>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <suspend_mem/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </power_management>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <iommu support='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <migration_features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <live/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <uri_transports>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <uri_transport>tcp</uri_transport>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <uri_transport>rdma</uri_transport>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </uri_transports>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </migration_features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <topology>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <cells num='1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <cell id='0'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          <memory unit='KiB'>7865364</memory>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          <pages unit='KiB' size='4'>1966341</pages>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          <pages unit='KiB' size='2048'>0</pages>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          <distances>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:            <sibling id='0' value='10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          </distances>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          <cpus num='4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:          </cpus>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        </cell>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </cells>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </topology>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <cache>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </cache>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <secmodel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model>selinux</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <doi>0</doi>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </secmodel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <secmodel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model>dac</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <doi>0</doi>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </secmodel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </host>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <guest>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <os_type>hvm</os_type>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <arch name='i686'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <wordsize>32</wordsize>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <domain type='qemu'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <domain type='kvm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <pae/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <nonpae/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <acpi default='on' toggle='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <apic default='on' toggle='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <cpuselection/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <deviceboot/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <disksnapshot default='on' toggle='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <externalSnapshot/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </guest>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <guest>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <os_type>hvm</os_type>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <arch name='x86_64'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <wordsize>64</wordsize>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <domain type='qemu'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <domain type='kvm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <acpi default='on' toggle='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <apic default='on' toggle='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <cpuselection/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <deviceboot/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <disksnapshot default='on' toggle='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <externalSnapshot/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </guest>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 
Jan 22 04:48:07 np0005591762 nova_compute[225313]: </capabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: #033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.342 225317 WARNING nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.342 225317 DEBUG nova.virt.libvirt.volume.mount [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.343 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.347 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 04:48:07 np0005591762 nova_compute[225313]: <domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <domain>kvm</domain>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <arch>i686</arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <vcpu max='240'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <iothreads supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <os supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='firmware'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <loader supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>rom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pflash</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='readonly'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>yes</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='secure'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </loader>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='maximumMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <vendor>AMD</vendor>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='succor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='custom' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <memoryBacking supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='sourceType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>anonymous</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>memfd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </memoryBacking>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <disk supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='diskDevice'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>disk</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cdrom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>floppy</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>lun</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ide</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>fdc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>sata</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <graphics supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vnc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egl-headless</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <video supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='modelType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vga</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cirrus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>none</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>bochs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ramfb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hostdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='mode'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>subsystem</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='startupPolicy'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>mandatory</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>requisite</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>optional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='subsysType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pci</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='capsType'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='pciBackend'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hostdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <rng supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>random</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <filesystem supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='driverType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>path</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>handle</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtiofs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </filesystem>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tpm supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-tis</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-crb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emulator</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>external</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendVersion'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>2.0</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </tpm>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <redirdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </redirdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <channel supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </channel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <crypto supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </crypto>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <interface supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>passt</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <panic supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>isa</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>hyperv</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </panic>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <console supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>null</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dev</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pipe</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stdio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>udp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tcp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu-vdagent</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <gic supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <genid supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backup supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <async-teardown supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <s390-pv supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <ps2 supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tdx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sev supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sgx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hyperv supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='features'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>relaxed</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vapic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>spinlocks</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vpindex</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>runtime</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>synic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stimer</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reset</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vendor_id</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>frequencies</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reenlightenment</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tlbflush</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ipi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>avic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emsr_bitmap</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>xmm_input</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hyperv>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <launchSecurity supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: </domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.351 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 04:48:07 np0005591762 nova_compute[225313]: <domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <domain>kvm</domain>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <arch>i686</arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <vcpu max='4096'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <iothreads supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <os supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='firmware'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <loader supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>rom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pflash</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='readonly'>
Jan 22 04:48:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>yes</value>
Jan 22 04:48:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:07.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='secure'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </loader>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='maximumMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <vendor>AMD</vendor>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='succor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='custom' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <memoryBacking supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='sourceType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>anonymous</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>memfd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </memoryBacking>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <disk supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='diskDevice'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>disk</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cdrom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>floppy</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>lun</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>fdc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>sata</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <graphics supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vnc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egl-headless</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <video supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='modelType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vga</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cirrus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>none</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>bochs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ramfb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hostdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='mode'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>subsystem</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='startupPolicy'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>mandatory</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>requisite</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>optional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='subsysType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pci</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='capsType'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='pciBackend'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hostdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <rng supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>random</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <filesystem supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='driverType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>path</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>handle</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtiofs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </filesystem>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tpm supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-tis</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-crb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emulator</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>external</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendVersion'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>2.0</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </tpm>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <redirdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </redirdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <channel supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </channel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <crypto supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </crypto>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <interface supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>passt</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <panic supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>isa</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>hyperv</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </panic>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <console supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>null</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dev</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pipe</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stdio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>udp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tcp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu-vdagent</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <gic supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <genid supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backup supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <async-teardown supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <s390-pv supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <ps2 supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tdx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sev supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sgx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hyperv supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='features'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>relaxed</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vapic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>spinlocks</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vpindex</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>runtime</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>synic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stimer</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reset</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vendor_id</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>frequencies</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reenlightenment</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tlbflush</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ipi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>avic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emsr_bitmap</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>xmm_input</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hyperv>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <launchSecurity supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: </domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.379 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.381 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 04:48:07 np0005591762 nova_compute[225313]: <domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <domain>kvm</domain>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <arch>x86_64</arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <vcpu max='240'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <iothreads supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <os supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='firmware'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <loader supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>rom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pflash</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='readonly'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>yes</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='secure'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </loader>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='maximumMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <vendor>AMD</vendor>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='succor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='custom' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <memoryBacking supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='sourceType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>anonymous</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>memfd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </memoryBacking>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <disk supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='diskDevice'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>disk</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cdrom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>floppy</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>lun</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ide</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>fdc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>sata</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <graphics supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vnc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egl-headless</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <video supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='modelType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vga</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cirrus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>none</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>bochs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ramfb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hostdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='mode'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>subsystem</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='startupPolicy'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>mandatory</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>requisite</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>optional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='subsysType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pci</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='capsType'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='pciBackend'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hostdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <rng supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>random</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <filesystem supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='driverType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>path</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>handle</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtiofs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </filesystem>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tpm supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-tis</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-crb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emulator</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>external</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendVersion'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>2.0</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </tpm>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <redirdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </redirdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <channel supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </channel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <crypto supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </crypto>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <interface supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>passt</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <panic supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>isa</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>hyperv</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </panic>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <console supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>null</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dev</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pipe</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stdio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>udp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tcp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu-vdagent</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <gic supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <genid supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backup supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <async-teardown supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <s390-pv supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <ps2 supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tdx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sev supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sgx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hyperv supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='features'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>relaxed</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vapic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>spinlocks</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vpindex</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>runtime</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>synic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stimer</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reset</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vendor_id</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>frequencies</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reenlightenment</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tlbflush</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ipi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>avic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emsr_bitmap</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>xmm_input</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hyperv>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <launchSecurity supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: </domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.449 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 04:48:07 np0005591762 nova_compute[225313]: <domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <domain>kvm</domain>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <arch>x86_64</arch>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <vcpu max='4096'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <iothreads supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <os supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='firmware'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>efi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <loader supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>rom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pflash</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='readonly'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>yes</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='secure'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>yes</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>no</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </loader>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-passthrough' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='hostPassthroughMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='maximum' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='maximumMigratable'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>on</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>off</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='host-model' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <vendor>AMD</vendor>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <maxphysaddr mode='passthrough' limit='48'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='x2apic'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='hypervisor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vaes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='stibp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='overflow-recov'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='succor'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lbrv'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='tsc-scale'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='flushbyasid'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pause-filter'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='pfthreshold'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='v-vmsave-vmload'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='vgif'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <mode name='custom' supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Broadwell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='ClearwaterForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ddpd-u'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sha512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm3'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sm4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Cooperlake-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Denverton-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Milan-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='EPYC-Turin-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amd-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='auto-ibrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vp2intersect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fs-gs-base-ns'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibpb-brtype'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='no-nested-data-bp'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='null-sel-clr-base'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='perfmon-v2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbpb'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='srso-user-kernel-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='stibp-always-on'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='GraniteRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-128'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-256'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx10-512'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='prefetchiti'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Haswell-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v6'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Icelake-Server-v7'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='KnightsMill-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4fmaps'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-4vnniw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512er'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512pf'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G4-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Opteron_G5-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fma4'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tbm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xop'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SapphireRapids-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='amx-tile'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-bf16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-fp16'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512-vpopcntdq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bitalg'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vbmi2'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrc'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fzrm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='la57'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='taa-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='tsx-ldtrk'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='xfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='SierraForest-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ifma'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-ne-convert'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx-vnni-int8'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bhi-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='bus-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cmpccxadd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fbsdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='fsrs'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ibrs-all'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='intel-psfd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ipred-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='lam'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mcdt-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='pbrsb-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='psdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rrsba-ctrl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='sbdr-ssdp-no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='serialize'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Client-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='hle'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='rtm'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Skylake-Server-v5'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512bw'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512cd'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512dq'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512f'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='avx512vl'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='mpx'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v2'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v3'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='core-capability'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='split-lock-detect'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='Snowridge-v4'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='cldemote'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='gfni'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdir64b'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='movdiri'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='athlon-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='core2duo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='coreduo-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='n270-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='ss'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <blockers model='phenom-v1'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnow'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <feature name='3dnowext'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </blockers>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </mode>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <memoryBacking supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <enum name='sourceType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>anonymous</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <value>memfd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </memoryBacking>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <disk supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='diskDevice'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>disk</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cdrom</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>floppy</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>lun</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>fdc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>sata</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <graphics supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vnc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egl-headless</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <video supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='modelType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vga</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>cirrus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>none</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>bochs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ramfb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hostdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='mode'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>subsystem</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='startupPolicy'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>mandatory</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>requisite</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>optional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='subsysType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pci</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>scsi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='capsType'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='pciBackend'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hostdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <rng supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtio-non-transitional</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>random</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>egd</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <filesystem supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='driverType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>path</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>handle</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>virtiofs</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </filesystem>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tpm supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-tis</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tpm-crb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emulator</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>external</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendVersion'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>2.0</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </tpm>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <redirdev supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='bus'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>usb</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </redirdev>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <channel supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </channel>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <crypto supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendModel'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>builtin</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </crypto>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <interface supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='backendType'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>default</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>passt</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <panic supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='model'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>isa</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>hyperv</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </panic>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <console supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='type'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>null</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vc</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pty</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dev</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>file</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>pipe</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stdio</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>udp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tcp</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>unix</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>qemu-vdagent</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>dbus</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <gic supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <vmcoreinfo supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <genid supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backingStoreInput supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <backup supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <async-teardown supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <s390-pv supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <ps2 supported='yes'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <tdx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sev supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <sgx supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <hyperv supported='yes'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <enum name='features'>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>relaxed</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vapic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>spinlocks</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vpindex</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>runtime</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>synic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>stimer</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reset</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>vendor_id</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>frequencies</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>reenlightenment</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>tlbflush</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>ipi</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>avic</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>emsr_bitmap</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <value>xmm_input</value>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </enum>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      <defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <spinlocks>4095</spinlocks>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <stimer_direct>on</stimer_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:      </defaults>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    </hyperv>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:    <launchSecurity supported='no'/>
Jan 22 04:48:07 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: </domainCapabilities>
Jan 22 04:48:07 np0005591762 nova_compute[225313]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.504 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.505 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.505 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.505 225317 INFO nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Secure Boot support detected#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.507 225317 INFO nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.507 225317 INFO nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.514 225317 DEBUG nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.532 225317 INFO nova.virt.node [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Determined node identity 15be1e53-1c88-43bb-b33e-cd7166bd9713 from /var/lib/nova/compute_id#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.543 225317 WARNING nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Compute nodes ['15be1e53-1c88-43bb-b33e-cd7166bd9713'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.565 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.586 225317 WARNING nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.586 225317 DEBUG oslo_concurrency.lockutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.586 225317 DEBUG oslo_concurrency.lockutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.586 225317 DEBUG oslo_concurrency.lockutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.586 225317 DEBUG nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.586 225317 DEBUG oslo_concurrency.processutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:48:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:48:07 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/166273376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:48:07 np0005591762 nova_compute[225313]: 2026-01-22 09:48:07.941 225317 DEBUG oslo_concurrency.processutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:48:07 np0005591762 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 04:48:07 np0005591762 systemd[1]: Started libvirt nodedev daemon.
Jan 22 04:48:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:08 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.308 225317 WARNING nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.308 225317 DEBUG nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5280MB free_disk=59.98828125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": 
"0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.309 225317 DEBUG oslo_concurrency.lockutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.309 225317 DEBUG oslo_concurrency.lockutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.318 225317 WARNING nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] No compute node record for compute-2.ctlplane.example.com:15be1e53-1c88-43bb-b33e-cd7166bd9713: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 15be1e53-1c88-43bb-b33e-cd7166bd9713 could not be found.#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.328 225317 INFO nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 15be1e53-1c88-43bb-b33e-cd7166bd9713#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.361 225317 DEBUG nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.361 225317 DEBUG nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.441 225317 INFO nova.scheduler.client.report [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [req-40a4c3f2-3c48-43a0-ba02-9178e35431a4] Created resource provider record via placement API for resource provider with UUID 15be1e53-1c88-43bb-b33e-cd7166bd9713 and name compute-2.ctlplane.example.com.#033[00m
Jan 22 04:48:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:08.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:08 np0005591762 nova_compute[225313]: 2026-01-22 09:48:08.697 225317 DEBUG oslo_concurrency.processutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:48:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:08 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.050 225317 DEBUG oslo_concurrency.processutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.353s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.053 225317 DEBUG nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 22 04:48:09 np0005591762 nova_compute[225313]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.054 225317 INFO nova.virt.libvirt.host [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.054 225317 DEBUG nova.compute.provider_tree [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.055 225317 DEBUG nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.124 225317 DEBUG nova.scheduler.client.report [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Updated inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.124 225317 DEBUG nova.compute.provider_tree [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Updating resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.125 225317 DEBUG nova.compute.provider_tree [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.212 225317 DEBUG nova.compute.provider_tree [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Updating resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.236 225317 DEBUG nova.compute.resource_tracker [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.237 225317 DEBUG oslo_concurrency.lockutils [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.237 225317 DEBUG nova.service [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.290 225317 DEBUG nova.service [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 22 04:48:09 np0005591762 nova_compute[225313]: 2026-01-22 09:48:09.290 225317 DEBUG nova.servicegroup.drivers.db [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 22 04:48:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:09 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:09.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:10 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:10.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:10 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa0002600 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:11 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:48:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:11.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:48:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:12 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98001bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:12.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:12 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:13 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:13.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:14 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:14.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:14 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98001bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:15 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:15.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:16 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:16.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:16 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:17 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:17.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:18 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:18.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:18 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa0003140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:19 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98002c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:19.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:20 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:20.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:20 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:21 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:21.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:21 np0005591762 podman[225707]: 2026-01-22 09:48:21.818024281 +0000 UTC m=+0.040291369 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 04:48:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:22 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98002c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:22.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:22 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa0003af0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:23 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:23.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:24.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:24 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98002c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:25 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa0003af0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:25.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:26 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:48:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:26.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:48:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:26 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:27 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98003d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:28 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:28.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:28 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55f02ed2d970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:29 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:29.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 04:48:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1531068086' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 04:48:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 04:48:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1531068086' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 04:48:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:30 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:30.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:30 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:30 np0005591762 podman[225735]: 2026-01-22 09:48:30.838218519 +0000 UTC m=+0.062007475 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 04:48:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:31 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:31.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:32 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:32.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:32 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a90006ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:33 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a98004660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:33.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:34 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004360 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:34.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:34 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:35 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:48:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:35.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:48:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:36.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:36 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:37 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:37.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:38 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:48:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:38.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:48:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:38 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:39 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:39.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:39 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:48:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:48:39 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:48:39 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:48:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:40 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:40.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:40 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:41 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:41.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:42 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:42.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:42 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:43 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:43.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:48:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:48:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:44 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:44.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:44 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:45 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:45.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:46 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:46.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:46 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:48:47.190 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:48:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:48:47.190 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:48:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:48:47.190 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:48:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:47 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:47.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:48 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:48.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:48 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac004c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:49 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:49.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.650300) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075329650325, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4565, "num_deletes": 502, "total_data_size": 12352877, "memory_usage": 12524376, "flush_reason": "Manual Compaction"}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075329665540, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 7991214, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13619, "largest_seqno": 18179, "table_properties": {"data_size": 7973990, "index_size": 11542, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4613, "raw_key_size": 36130, "raw_average_key_size": 19, "raw_value_size": 7938239, "raw_average_value_size": 4335, "num_data_blocks": 504, "num_entries": 1831, "num_filter_entries": 1831, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074909, "oldest_key_time": 1769074909, "file_creation_time": 1769075329, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 15312 microseconds, and 10146 cpu microseconds.
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.665616) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 7991214 bytes OK
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.665658) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.666000) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.666011) EVENT_LOG_v1 {"time_micros": 1769075329666008, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.666021) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12332864, prev total WAL file size 12332864, number of live WAL files 2.
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.667850) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(7803KB)], [27(11MB)]
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075329667876, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 20163697, "oldest_snapshot_seqno": -1}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5137 keys, 15236266 bytes, temperature: kUnknown
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075329699599, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15236266, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15197734, "index_size": 24598, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 128541, "raw_average_key_size": 25, "raw_value_size": 15100459, "raw_average_value_size": 2939, "num_data_blocks": 1034, "num_entries": 5137, "num_filter_entries": 5137, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769075329, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.699929) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15236266 bytes
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.702056) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 631.0 rd, 476.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(7.6, 11.6 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(4.4) write-amplify(1.9) OK, records in: 6160, records dropped: 1023 output_compression: NoCompression
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.702069) EVENT_LOG_v1 {"time_micros": 1769075329702063, "job": 14, "event": "compaction_finished", "compaction_time_micros": 31953, "compaction_time_cpu_micros": 21152, "output_level": 6, "num_output_files": 1, "total_output_size": 15236266, "num_input_records": 6160, "num_output_records": 5137, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075329703071, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075329704395, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.667809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.704454) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.704457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.704459) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.704460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:48:49 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:48:49.704462) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:48:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:50 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:50.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:50 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:51 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:51.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:52 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:52.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:52 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:52 np0005591762 podman[225934]: 2026-01-22 09:48:52.810919615 +0000 UTC m=+0.034562714 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 04:48:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:53 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac006790 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:53.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:54 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:48:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:54.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:48:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:54 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:55 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:55.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:56 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:48:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:56.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:48:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:56 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:48:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:57 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:57.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:58 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aa00049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:48:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:48:58.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:48:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:58 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:48:59 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac006790 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:48:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:48:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:48:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:48:59.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:48:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:48:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:48:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:48:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:49:00 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac006790 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:00.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:49:00 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4aac006790 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:01 np0005591762 kernel: ganesha.nfsd[225647]: segfault at 50 ip 00007f4b1fb5132e sp 00007f4aa5ffa210 error 4 in libntirpc.so.5.8[7f4b1fb36000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 22 04:49:01 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:49:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[218535]: 22/01/2026 09:49:01 : epoch 6971f22c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4a900079d0 fd 38 proxy ignored for local
Jan 22 04:49:01 np0005591762 systemd[1]: Started Process Core Dump (PID 225960/UID 0).
Jan 22 04:49:01 np0005591762 podman[225961]: 2026-01-22 09:49:01.429205048 +0000 UTC m=+0.060919302 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 04:49:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:01.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:02 np0005591762 systemd-coredump[225962]: Process 218539 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007f4b1fb5132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:49:02 np0005591762 systemd[1]: systemd-coredump@9-225960-0.service: Deactivated successfully.
Jan 22 04:49:02 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:49:02 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 04:49:02 np0005591762 podman[225990]: 2026-01-22 09:49:02.405125857 +0000 UTC m=+0.017103552 container died 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:49:02 np0005591762 systemd[1]: var-lib-containers-storage-overlay-a5e36e4297e1ebc9482605461bf93afa51fd40a5a28ed4d1fc4f8ba22fc0adf0-merged.mount: Deactivated successfully.
Jan 22 04:49:02 np0005591762 podman[225990]: 2026-01-22 09:49:02.42399136 +0000 UTC m=+0.035969035 container remove 1f6067c0837d352da9e43f88effc99695618810b2edcbbd6e8b2027697899e40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Jan 22 04:49:02 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:49:02 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:49:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:02.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:03.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:04 np0005591762 nova_compute[225313]: 2026-01-22 09:49:04.291 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:04 np0005591762 nova_compute[225313]: 2026-01-22 09:49:04.311 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:04.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:05.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:06.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.724 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.724 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.750 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.750 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.751 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.751 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.751 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.751 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.752 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.752 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.752 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.779 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.779 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.779 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.779 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:49:06 np0005591762 nova_compute[225313]: 2026-01-22 09:49:06.780 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:49:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:49:07 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/160793156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.116 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.313 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.314 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5289MB free_disk=59.98828125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": 
"0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.315 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.315 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:49:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094907 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:49:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:07.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.791 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.792 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:49:07 np0005591762 nova_compute[225313]: 2026-01-22 09:49:07.823 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:49:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:49:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3698669258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:49:08 np0005591762 nova_compute[225313]: 2026-01-22 09:49:08.159 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:49:08 np0005591762 nova_compute[225313]: 2026-01-22 09:49:08.163 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:49:08 np0005591762 nova_compute[225313]: 2026-01-22 09:49:08.199 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:49:08 np0005591762 nova_compute[225313]: 2026-01-22 09:49:08.200 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:49:08 np0005591762 nova_compute[225313]: 2026-01-22 09:49:08.200 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:49:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 22 04:49:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2611846470' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 22 04:49:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 22 04:49:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2319385338' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 22 04:49:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:08.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:09.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:10.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:11.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:49:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:12.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:49:12 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 10.
Jan 22 04:49:12 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:49:12 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:49:12 np0005591762 podman[226140]: 2026-01-22 09:49:12.898932542 +0000 UTC m=+0.027059095 container create b39ad1e313312e9cd25d9b1c4eab998b8a34488c80e7d7b717a3a613d1f7dcf5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Jan 22 04:49:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b26bddc831626b04e93cbff85b986a7f406d08159742f1c92f96444019508684/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:49:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b26bddc831626b04e93cbff85b986a7f406d08159742f1c92f96444019508684/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:49:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b26bddc831626b04e93cbff85b986a7f406d08159742f1c92f96444019508684/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:49:12 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b26bddc831626b04e93cbff85b986a7f406d08159742f1c92f96444019508684/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:49:12 np0005591762 podman[226140]: 2026-01-22 09:49:12.9326017 +0000 UTC m=+0.060728274 container init b39ad1e313312e9cd25d9b1c4eab998b8a34488c80e7d7b717a3a613d1f7dcf5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 04:49:12 np0005591762 podman[226140]: 2026-01-22 09:49:12.938498269 +0000 UTC m=+0.066624822 container start b39ad1e313312e9cd25d9b1c4eab998b8a34488c80e7d7b717a3a613d1f7dcf5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:49:12 np0005591762 bash[226140]: b39ad1e313312e9cd25d9b1c4eab998b8a34488c80e7d7b717a3a613d1f7dcf5
Jan 22 04:49:12 np0005591762 podman[226140]: 2026-01-22 09:49:12.887820057 +0000 UTC m=+0.015946620 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:49:12 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:49:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:12 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:49:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:13.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:14.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:15.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:16.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:17.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:18.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:18 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:49:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:18 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:49:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:19.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:20.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:21.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:22.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:23.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:23 np0005591762 podman[226207]: 2026-01-22 09:49:23.818095157 +0000 UTC m=+0.039162488 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 04:49:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:24.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:25 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f169c000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:26 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:26.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:26 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/094927 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:49:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:27 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1690001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:27.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:28 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1690001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:28.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:28 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:29 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 22 04:49:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1993541143' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 22 04:49:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:29.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:30 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1690001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:30.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:30 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f168c001ee0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:31 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:31.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:31 np0005591762 podman[226271]: 2026-01-22 09:49:31.832566833 +0000 UTC m=+0.055106743 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:49:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:32 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:32.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:32 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:33 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f168c002a00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:33.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:34 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:34 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:35 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:35.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:36 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f168c002a00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:36.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:36 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688004b50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:37 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688004b50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:37.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:38 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:38.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:38 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:39 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688004b50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:39.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:40 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688004b50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:40.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:40 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:41 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:41.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:42 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:42.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:42 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:43 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:43.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:43 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:49:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:49:43 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:49:43 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:49:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:44 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900030c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:44.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:44 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:45 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:45.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:46 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:49:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:46.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:49:46 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:49:46 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:49:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:46 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:49:47.190 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:49:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:49:47.191 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:49:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:49:47.191 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:49:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:47 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:47.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:48 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:48.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:48 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:49 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:49.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:50 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:50.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:50 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:51 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:51.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:52 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:52.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:52 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:53 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f168c003c20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:53.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:54 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f168c003c20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:54.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:54 np0005591762 podman[226447]: 2026-01-22 09:49:54.814157523 +0000 UTC m=+0.037406324 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:49:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:54 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:55 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:55.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:56 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:56.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:56 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006040 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:49:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:57 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f168c003c20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:57.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:58 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16a4002600 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:49:58.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:58 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:49:59 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006d70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:49:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:49:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:49:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:49:59.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:49:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:49:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:49:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:49:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:00 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006d70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:50:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:50:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:50:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:50:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:50:00.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:50:00 np0005591762 ceph-mon[75519]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Jan 22 04:50:00 np0005591762 ceph-mon[75519]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Jan 22 04:50:00 np0005591762 ceph-mon[75519]:    daemon nfs.cephfs.0.0.compute-1.pszzrs on compute-1 is in unknown state
Jan 22 04:50:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:00 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16a4003140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:01 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:50:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:50:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:50:01.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:50:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:50:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:50:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:50:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:02 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:50:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:50:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:50:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:50:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:50:02.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:50:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:02 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1688006d70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:02 np0005591762 podman[226472]: 2026-01-22 09:50:02.870195441 +0000 UTC m=+0.092953273 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:50:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:03 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16a4003140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:50:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:50:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:50:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:50:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:50:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:50:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[226154]: 22/01/2026 09:50:04 : epoch 6971f298 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f16900048b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:50:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:50:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:50:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:50:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:52:57 np0005591762 nova_compute[225313]: 2026-01-22 09:52:57.963 225317 INFO nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Creating config drive at /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2/disk.config#033[00m
Jan 22 04:52:57 np0005591762 nova_compute[225313]: 2026-01-22 09:52:57.967 225317 DEBUG oslo_concurrency.processutils [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmiyqy30w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:52:58 np0005591762 rsyslogd[963]: imjournal: 1925 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.088 225317 DEBUG oslo_concurrency.processutils [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmiyqy30w" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.106 225317 DEBUG nova.storage.rbd_utils [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image f8af2f33-afb8-40b5-8850-a24d410bdae2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.108 225317 DEBUG oslo_concurrency.processutils [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2/disk.config f8af2f33-afb8-40b5-8850-a24d410bdae2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.190 225317 DEBUG oslo_concurrency.processutils [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2/disk.config f8af2f33-afb8-40b5-8850-a24d410bdae2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.190 225317 INFO nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Deleting local config drive /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2/disk.config because it was imported into RBD.#033[00m
Jan 22 04:52:58 np0005591762 systemd[1]: Starting libvirt secret daemon...
Jan 22 04:52:58 np0005591762 systemd[1]: Started libvirt secret daemon.
Jan 22 04:52:58 np0005591762 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 22 04:52:58 np0005591762 NetworkManager[48910]: <info>  [1769075578.2556] manager: (tap31d9d6b3-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 22 04:52:58 np0005591762 kernel: tap31d9d6b3-c1: entered promiscuous mode
Jan 22 04:52:58 np0005591762 ovn_controller[133622]: 2026-01-22T09:52:58Z|00027|binding|INFO|Claiming lport 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d for this chassis.
Jan 22 04:52:58 np0005591762 ovn_controller[133622]: 2026-01-22T09:52:58Z|00028|binding|INFO|31d9d6b3-c161-4ddb-8d18-682f61a7fd7d: Claiming fa:16:3e:e5:82:3d 10.100.0.25
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.261 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.267 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:82:3d 10.100.0.25'], port_security=['fa:16:3e:e5:82:3d 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'f8af2f33-afb8-40b5-8850-a24d410bdae2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aad0ad0-c976-429e-adb6-82f8246f3816', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c95f7a36-2698-48be-a6da-4dc0238209e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcbb2c42-2995-4891-bbde-7210d9fdc575, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=31d9d6b3-c161-4ddb-8d18-682f61a7fd7d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.268 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d in datapath 7aad0ad0-c976-429e-adb6-82f8246f3816 bound to our chassis#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.270 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7aad0ad0-c976-429e-adb6-82f8246f3816#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.270 143150 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp_7b0w06u/privsep.sock']#033[00m
Jan 22 04:52:58 np0005591762 systemd-udevd[228160]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:52:58 np0005591762 NetworkManager[48910]: <info>  [1769075578.3104] device (tap31d9d6b3-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:52:58 np0005591762 NetworkManager[48910]: <info>  [1769075578.3110] device (tap31d9d6b3-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.313 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:52:58 np0005591762 ovn_controller[133622]: 2026-01-22T09:52:58Z|00029|binding|INFO|Setting lport 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d ovn-installed in OVS
Jan 22 04:52:58 np0005591762 ovn_controller[133622]: 2026-01-22T09:52:58Z|00030|binding|INFO|Setting lport 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d up in Southbound
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.317 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:52:58 np0005591762 systemd-machined[193990]: New machine qemu-1-instance-00000002.
Jan 22 04:52:58 np0005591762 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 22 04:52:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:52:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:52:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:52:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.702 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075578.7024736, f8af2f33-afb8-40b5-8850-a24d410bdae2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.703 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] VM Started (Lifecycle Event)#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.717 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.719 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075578.7026355, f8af2f33-afb8-40b5-8850-a24d410bdae2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.719 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] VM Paused (Lifecycle Event)#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.732 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.734 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:52:58 np0005591762 nova_compute[225313]: 2026-01-22 09:52:58.746 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:52:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:52:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:52:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:52:58 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.828 143150 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.829 143150 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_7b0w06u/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.749 228218 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.752 228218 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.754 228218 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.754 228218 INFO oslo.privsep.daemon [-] privsep daemon running as pid 228218#033[00m
Jan 22 04:52:58 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:58.831 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4c792a-24f0-4593-88d1-2b28c10bea69]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:52:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:52:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:52:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:52:58.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.009 225317 DEBUG nova.compute.manager [req-a453a336-7545-44e9-a875-63898a7877de req-39063050-aeb7-46ac-b6c6-dc8766df2ef5 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received event network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.010 225317 DEBUG oslo_concurrency.lockutils [req-a453a336-7545-44e9-a875-63898a7877de req-39063050-aeb7-46ac-b6c6-dc8766df2ef5 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.010 225317 DEBUG oslo_concurrency.lockutils [req-a453a336-7545-44e9-a875-63898a7877de req-39063050-aeb7-46ac-b6c6-dc8766df2ef5 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.010 225317 DEBUG oslo_concurrency.lockutils [req-a453a336-7545-44e9-a875-63898a7877de req-39063050-aeb7-46ac-b6c6-dc8766df2ef5 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.010 225317 DEBUG nova.compute.manager [req-a453a336-7545-44e9-a875-63898a7877de req-39063050-aeb7-46ac-b6c6-dc8766df2ef5 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Processing event network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.011 225317 DEBUG nova.compute.manager [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.019 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075579.0182304, f8af2f33-afb8-40b5-8850-a24d410bdae2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.019 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] VM Resumed (Lifecycle Event)#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.020 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.022 225317 INFO nova.virt.libvirt.driver [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Instance spawned successfully.#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.022 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.056 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.060 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.062 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.063 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.063 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.063 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.064 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.064 225317 DEBUG nova.virt.libvirt.driver [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.118 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.185 225317 INFO nova.compute.manager [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Took 9.45 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.186 225317 DEBUG nova.compute.manager [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.189 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.190 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.241 225317 INFO nova.compute.manager [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Took 10.08 seconds to build instance.#033[00m
Jan 22 04:52:59 np0005591762 nova_compute[225313]: 2026-01-22 09:52:59.263 225317 DEBUG oslo_concurrency.lockutils [None req-0d8955c9-2574-42df-971e-2c2dae1dafb8 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.340 228218 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.341 228218 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.341 228218 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:52:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8448000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:52:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:52:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:52:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:52:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:52:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:52:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:52:59.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.871 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[2d039585-4787-45a3-b180-0c322fd9423a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.872 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7aad0ad0-c1 in ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.874 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7aad0ad0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.874 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed3226b-2633-408b-8a84-f2f77f6c682b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.877 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7894ba-f94b-4dca-9f01-079aeb1a44a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.896 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3a1e7b-aa42-488b-9d56-55e954b19141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.913 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfef272-41f8-408a-b1ba-9bd75ac19ddf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:52:59 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:52:59.914 143150 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp9fwjkj4x/privsep.sock']#033[00m
Jan 22 04:52:59 np0005591762 podman[228243]: 2026-01-22 09:52:59.986852602 +0000 UTC m=+0.071583943 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 04:53:00 np0005591762 nova_compute[225313]: 2026-01-22 09:53:00.158 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:00 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8434001e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.454 143150 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.454 143150 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9fwjkj4x/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.378 228264 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.384 228264 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.388 228264 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.389 228264 INFO oslo.privsep.daemon [-] privsep daemon running as pid 228264#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.456 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[23584548-2445-4f9a-81ae-b9d1962db165]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:00.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.873 228264 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.874 228264 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:00 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:00.874 228264 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:00 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.093 225317 DEBUG nova.compute.manager [req-d3bc9918-08d2-449e-9717-16afe3853a05 req-c40f7682-e2c9-46b1-be60-8d509bbe40c6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received event network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.095 225317 DEBUG oslo_concurrency.lockutils [req-d3bc9918-08d2-449e-9717-16afe3853a05 req-c40f7682-e2c9-46b1-be60-8d509bbe40c6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.095 225317 DEBUG oslo_concurrency.lockutils [req-d3bc9918-08d2-449e-9717-16afe3853a05 req-c40f7682-e2c9-46b1-be60-8d509bbe40c6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.095 225317 DEBUG oslo_concurrency.lockutils [req-d3bc9918-08d2-449e-9717-16afe3853a05 req-c40f7682-e2c9-46b1-be60-8d509bbe40c6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.096 225317 DEBUG nova.compute.manager [req-d3bc9918-08d2-449e-9717-16afe3853a05 req-c40f7682-e2c9-46b1-be60-8d509bbe40c6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] No waiting events found dispatching network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.096 225317 WARNING nova.compute.manager [req-d3bc9918-08d2-449e-9717-16afe3853a05 req-c40f7682-e2c9-46b1-be60-8d509bbe40c6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received unexpected event network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d for instance with vm_state active and task_state None.#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.360 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[7878e7ac-7915-41f2-9b62-93c71ae948cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 NetworkManager[48910]: <info>  [1769075581.3775] manager: (tap7aad0ad0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.378 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e1868505-cab1-4502-9b2d-e52ed4f8099b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 systemd-udevd[228278]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.407 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[970d896e-21ef-4054-a4c0-4493386c48e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.409 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[4271e01b-410b-4270-b6de-170b8619a1eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 NetworkManager[48910]: <info>  [1769075581.4299] device (tap7aad0ad0-c0): carrier: link connected
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.434 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[bf646679-e351-46f1-93a4-53ada87f8ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.451 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e2bde225-bce3-4ac7-97a2-bffa31f5e567]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7aad0ad0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:bf:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 315898, 'reachable_time': 27026, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228289, 'error': None, 'target': 'ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.464 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9dc9bf-3a79-4943-900c-c15de6934691]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:bf14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 315898, 'tstamp': 315898}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228291, 'error': None, 'target': 'ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095301 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:53:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:01 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8434001e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.478 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[a4112b5b-f33f-44c9-8294-096cbf7b2ba0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7aad0ad0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:bf:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 315898, 'reachable_time': 27026, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228292, 'error': None, 'target': 'ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.498 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c03a14-917d-4532-adb7-5c86f9a2d5d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.539 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[7e151002-e6c1-4ab4-8793-1c3225bfb8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.540 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aad0ad0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.540 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.542 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7aad0ad0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:53:01 np0005591762 NetworkManager[48910]: <info>  [1769075581.5439] manager: (tap7aad0ad0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 22 04:53:01 np0005591762 kernel: tap7aad0ad0-c0: entered promiscuous mode
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.544 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.547 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.548 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7aad0ad0-c0, col_values=(('external_ids', {'iface-id': '82b7f732-41c1-485c-bd1b-1d1c4af2483c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.549 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:01 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:01Z|00031|binding|INFO|Releasing lport 82b7f732-41c1-485c-bd1b-1d1c4af2483c from this chassis (sb_readonly=0)
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.551 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7aad0ad0-c976-429e-adb6-82f8246f3816.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7aad0ad0-c976-429e-adb6-82f8246f3816.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.552 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e58d8460-7b63-4f7b-bbbb-0e9d46830705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.553 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-7aad0ad0-c976-429e-adb6-82f8246f3816
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/7aad0ad0-c976-429e-adb6-82f8246f3816.pid.haproxy
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID 7aad0ad0-c976-429e-adb6-82f8246f3816
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.553 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816', 'env', 'PROCESS_TAG=haproxy-7aad0ad0-c976-429e-adb6-82f8246f3816', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7aad0ad0-c976-429e-adb6-82f8246f3816.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.563 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:01.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:01 np0005591762 nova_compute[225313]: 2026-01-22 09:53:01.832 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:01 np0005591762 podman[228321]: 2026-01-22 09:53:01.848583212 +0000 UTC m=+0.038648510 container create c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 04:53:01 np0005591762 systemd[1]: Started libpod-conmon-c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325.scope.
Jan 22 04:53:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:01 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:53:01 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1db2981bd7ea9a466dc42a213495b63fecae4282a68032d9fb35a9feb22fe4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:53:01 np0005591762 podman[228321]: 2026-01-22 09:53:01.919669837 +0000 UTC m=+0.109735154 container init c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 04:53:01 np0005591762 podman[228321]: 2026-01-22 09:53:01.926022385 +0000 UTC m=+0.116087682 container start c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 04:53:01 np0005591762 podman[228321]: 2026-01-22 09:53:01.82995654 +0000 UTC m=+0.020021836 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:53:01 np0005591762 neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816[228332]: [NOTICE]   (228351) : New worker (228361) forked
Jan 22 04:53:01 np0005591762 neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816[228332]: [NOTICE]   (228351) : Loading success.
Jan 22 04:53:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:01.985 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:53:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:02 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:02 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:53:02 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:53:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:02.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:02 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:03 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:03.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:04 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:53:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:53:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:04 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095305 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:53:05 np0005591762 nova_compute[225313]: 2026-01-22 09:53:05.160 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:05 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8434002dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:05.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:06 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5447] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5452] device (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <warn>  [1769075586.5453] device (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5460] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5463] device (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <warn>  [1769075586.5464] device (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5470] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.542 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5478] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5482] device (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 04:53:06 np0005591762 NetworkManager[48910]: <info>  [1769075586.5492] device (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.622 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:06 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:06Z|00032|binding|INFO|Releasing lport 82b7f732-41c1-485c-bd1b-1d1c4af2483c from this chassis (sb_readonly=0)
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.628 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.733 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.734 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.734 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.741 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:06 np0005591762 nova_compute[225313]: 2026-01-22 09:53:06.833 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:06.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:06 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:06.987 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:53:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:06 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:07 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:07.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:08 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:08 np0005591762 nova_compute[225313]: 2026-01-22 09:53:08.747 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:08 np0005591762 nova_compute[225313]: 2026-01-22 09:53:08.749 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:08 np0005591762 nova_compute[225313]: 2026-01-22 09:53:08.749 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:08 np0005591762 podman[228377]: 2026-01-22 09:53:08.841885917 +0000 UTC m=+0.059376484 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 04:53:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:08.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:08 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:09Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:82:3d 10.100.0.25
Jan 22 04:53:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:09Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:82:3d 10.100.0.25
Jan 22 04:53:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:09 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.720 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.721 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:53:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:09.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.856 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "refresh_cache-f8af2f33-afb8-40b5-8850-a24d410bdae2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.857 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquired lock "refresh_cache-f8af2f33-afb8-40b5-8850-a24d410bdae2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.857 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 04:53:09 np0005591762 nova_compute[225313]: 2026-01-22 09:53:09.857 225317 DEBUG nova.objects.instance [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f8af2f33-afb8-40b5-8850-a24d410bdae2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:53:10 np0005591762 nova_compute[225313]: 2026-01-22 09:53:10.161 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:10 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:10.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:10 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.165 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Updating instance_info_cache with network_info: [{"id": "31d9d6b3-c161-4ddb-8d18-682f61a7fd7d", "address": "fa:16:3e:e5:82:3d", "network": {"id": "7aad0ad0-c976-429e-adb6-82f8246f3816", "bridge": "br-int", "label": "tempest-network-smoke--1851471753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d9d6b3-c1", "ovs_interfaceid": "31d9d6b3-c161-4ddb-8d18-682f61a7fd7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.180 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Releasing lock "refresh_cache-f8af2f33-afb8-40b5-8850-a24d410bdae2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.180 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.180 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.181 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.181 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:53:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:11 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:11.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:11 np0005591762 nova_compute[225313]: 2026-01-22 09:53:11.834 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:12 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8448008cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:12 np0005591762 nova_compute[225313]: 2026-01-22 09:53:12.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:12.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:13 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8448008cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:13 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:13 np0005591762 nova_compute[225313]: 2026-01-22 09:53:13.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:53:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:13.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:13 np0005591762 nova_compute[225313]: 2026-01-22 09:53:13.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:13 np0005591762 nova_compute[225313]: 2026-01-22 09:53:13.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:13 np0005591762 nova_compute[225313]: 2026-01-22 09:53:13.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:13 np0005591762 nova_compute[225313]: 2026-01-22 09:53:13.738 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:53:13 np0005591762 nova_compute[225313]: 2026-01-22 09:53:13.738 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:53:14 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:53:14 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2794436243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.079 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.124 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.124 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.335 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.337 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4810MB free_disk=59.89728927612305GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, 
"label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": 
"0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.337 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.338 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.437 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Instance f8af2f33-afb8-40b5-8850-a24d410bdae2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.437 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.438 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:53:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:14 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8448009b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.482 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing inventories for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.529 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating ProviderTree inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.529 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.548 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing aggregate associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.566 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing trait associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, traits: HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX512VAES,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AESNI,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.591 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:53:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:14.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.926 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.930 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.967 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updated inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.967 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.968 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.981 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:53:14 np0005591762 nova_compute[225313]: 2026-01-22 09:53:14.982 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:15 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8448009b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:15 np0005591762 nova_compute[225313]: 2026-01-22 09:53:15.163 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:15 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8448009b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:15.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.268 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "f8af2f33-afb8-40b5-8850-a24d410bdae2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.269 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.269 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.269 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.269 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.270 225317 INFO nova.compute.manager [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Terminating instance#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.271 225317 DEBUG nova.compute.manager [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 04:53:16 np0005591762 kernel: tap31d9d6b3-c1 (unregistering): left promiscuous mode
Jan 22 04:53:16 np0005591762 NetworkManager[48910]: <info>  [1769075596.3178] device (tap31d9d6b3-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 04:53:16 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:16Z|00033|binding|INFO|Releasing lport 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d from this chassis (sb_readonly=0)
Jan 22 04:53:16 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:16Z|00034|binding|INFO|Setting lport 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d down in Southbound
Jan 22 04:53:16 np0005591762 ovn_controller[133622]: 2026-01-22T09:53:16Z|00035|binding|INFO|Removing iface tap31d9d6b3-c1 ovn-installed in OVS
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.328 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.330 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:82:3d 10.100.0.25'], port_security=['fa:16:3e:e5:82:3d 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'f8af2f33-afb8-40b5-8850-a24d410bdae2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aad0ad0-c976-429e-adb6-82f8246f3816', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c95f7a36-2698-48be-a6da-4dc0238209e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcbb2c42-2995-4891-bbde-7210d9fdc575, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=31d9d6b3-c161-4ddb-8d18-682f61a7fd7d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.331 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 31d9d6b3-c161-4ddb-8d18-682f61a7fd7d in datapath 7aad0ad0-c976-429e-adb6-82f8246f3816 unbound from our chassis#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.333 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7aad0ad0-c976-429e-adb6-82f8246f3816, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.334 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[a31088bb-dfc8-4b40-bab2-246967da4946]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.335 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816 namespace which is not needed anymore#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.343 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:16 np0005591762 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 22 04:53:16 np0005591762 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 10.630s CPU time.
Jan 22 04:53:16 np0005591762 systemd-machined[193990]: Machine qemu-1-instance-00000002 terminated.
Jan 22 04:53:16 np0005591762 neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816[228332]: [NOTICE]   (228351) : haproxy version is 2.8.14-c23fe91
Jan 22 04:53:16 np0005591762 neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816[228332]: [NOTICE]   (228351) : path to executable is /usr/sbin/haproxy
Jan 22 04:53:16 np0005591762 neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816[228332]: [ALERT]    (228351) : Current worker (228361) exited with code 143 (Terminated)
Jan 22 04:53:16 np0005591762 neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816[228332]: [WARNING]  (228351) : All workers exited. Exiting... (0)
Jan 22 04:53:16 np0005591762 systemd[1]: libpod-c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325.scope: Deactivated successfully.
Jan 22 04:53:16 np0005591762 podman[228501]: 2026-01-22 09:53:16.438984879 +0000 UTC m=+0.033959577 container died c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 04:53:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:16 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:16 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325-userdata-shm.mount: Deactivated successfully.
Jan 22 04:53:16 np0005591762 systemd[1]: var-lib-containers-storage-overlay-bc1db2981bd7ea9a466dc42a213495b63fecae4282a68032d9fb35a9feb22fe4-merged.mount: Deactivated successfully.
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.461 225317 DEBUG nova.compute.manager [req-2e096eff-0fc8-4030-90eb-ed4c40453407 req-dd7519d0-09ca-4a53-8c66-a1b1b4b0553e e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received event network-vif-unplugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.462 225317 DEBUG oslo_concurrency.lockutils [req-2e096eff-0fc8-4030-90eb-ed4c40453407 req-dd7519d0-09ca-4a53-8c66-a1b1b4b0553e e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.462 225317 DEBUG oslo_concurrency.lockutils [req-2e096eff-0fc8-4030-90eb-ed4c40453407 req-dd7519d0-09ca-4a53-8c66-a1b1b4b0553e e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.462 225317 DEBUG oslo_concurrency.lockutils [req-2e096eff-0fc8-4030-90eb-ed4c40453407 req-dd7519d0-09ca-4a53-8c66-a1b1b4b0553e e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.462 225317 DEBUG nova.compute.manager [req-2e096eff-0fc8-4030-90eb-ed4c40453407 req-dd7519d0-09ca-4a53-8c66-a1b1b4b0553e e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] No waiting events found dispatching network-vif-unplugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.462 225317 DEBUG nova.compute.manager [req-2e096eff-0fc8-4030-90eb-ed4c40453407 req-dd7519d0-09ca-4a53-8c66-a1b1b4b0553e e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received event network-vif-unplugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 04:53:16 np0005591762 podman[228501]: 2026-01-22 09:53:16.466364468 +0000 UTC m=+0.061339165 container cleanup c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 04:53:16 np0005591762 systemd[1]: libpod-conmon-c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325.scope: Deactivated successfully.
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.493 225317 INFO nova.virt.libvirt.driver [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Instance destroyed successfully.#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.493 225317 DEBUG nova.objects.instance [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'resources' on Instance uuid f8af2f33-afb8-40b5-8850-a24d410bdae2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.505 225317 DEBUG nova.virt.libvirt.vif [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:52:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2017243371',display_name='tempest-TestNetworkBasicOps-server-2017243371',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2017243371',id=2,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLbcFw9HGgrdmDIv6T4O0iPEEnfCF6yuT7yBBfHklto7oT1hm+s0H43ED6qjH75taJ4OK4fDgXUY6GnO+UtqmZxARK6jOuNSDEgdESoO3drpDVSZhsPADTjhcCQaC44PEA==',key_name='tempest-TestNetworkBasicOps-1509494119',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:52:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-2444lqvr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:52:59Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=f8af2f33-afb8-40b5-8850-a24d410bdae2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31d9d6b3-c161-4ddb-8d18-682f61a7fd7d", "address": "fa:16:3e:e5:82:3d", "network": {"id": "7aad0ad0-c976-429e-adb6-82f8246f3816", "bridge": "br-int", "label": "tempest-network-smoke--1851471753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d9d6b3-c1", "ovs_interfaceid": "31d9d6b3-c161-4ddb-8d18-682f61a7fd7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.506 225317 DEBUG nova.network.os_vif_util [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "31d9d6b3-c161-4ddb-8d18-682f61a7fd7d", "address": "fa:16:3e:e5:82:3d", "network": {"id": "7aad0ad0-c976-429e-adb6-82f8246f3816", "bridge": "br-int", "label": "tempest-network-smoke--1851471753", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d9d6b3-c1", "ovs_interfaceid": "31d9d6b3-c161-4ddb-8d18-682f61a7fd7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.507 225317 DEBUG nova.network.os_vif_util [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:82:3d,bridge_name='br-int',has_traffic_filtering=True,id=31d9d6b3-c161-4ddb-8d18-682f61a7fd7d,network=Network(7aad0ad0-c976-429e-adb6-82f8246f3816),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d9d6b3-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.508 225317 DEBUG os_vif [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:82:3d,bridge_name='br-int',has_traffic_filtering=True,id=31d9d6b3-c161-4ddb-8d18-682f61a7fd7d,network=Network(7aad0ad0-c976-429e-adb6-82f8246f3816),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d9d6b3-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.509 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.509 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31d9d6b3-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.512 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:53:16 np0005591762 podman[228525]: 2026-01-22 09:53:16.512573831 +0000 UTC m=+0.028997170 container remove c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.514 225317 INFO os_vif [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:82:3d,bridge_name='br-int',has_traffic_filtering=True,id=31d9d6b3-c161-4ddb-8d18-682f61a7fd7d,network=Network(7aad0ad0-c976-429e-adb6-82f8246f3816),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d9d6b3-c1')#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.517 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[561dfc1f-5ca8-4f67-aa9e-bcaef8c5068a]: (4, ('Thu Jan 22 09:53:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816 (c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325)\nc12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325\nThu Jan 22 09:53:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816 (c12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325)\nc12028a56e048a5eea6175e0ace40b59d37f4a5b04a571ee96305be2e77b7325\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.518 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[492a805f-d854-4f12-99e9-9d55dc3cb011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 kernel: tap7aad0ad0-c0: left promiscuous mode
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.521 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aad0ad0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.538 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.539 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b00438-19a9-4d5b-9b30-67950745a64b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.547 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[431ae248-7e1b-4d41-bfe0-0b2e076de38d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.548 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[db50040a-a82f-4f2f-b239-f8ea7c71bc03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.560 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[69ed4b58-c337-4c00-87de-a3dff4ac35ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 315891, 'reachable_time': 17099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228564, 'error': None, 'target': 'ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 systemd[1]: run-netns-ovnmeta\x2d7aad0ad0\x2dc976\x2d429e\x2dadb6\x2d82f8246f3816.mount: Deactivated successfully.
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.569 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7aad0ad0-c976-429e-adb6-82f8246f3816 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 04:53:16 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:16.569 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[9086f6da-772b-460b-bdcf-6503bcb8e9cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:53:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.694 225317 INFO nova.virt.libvirt.driver [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Deleting instance files /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2_del#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.694 225317 INFO nova.virt.libvirt.driver [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Deletion of /var/lib/nova/instances/f8af2f33-afb8-40b5-8850-a24d410bdae2_del complete#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.744 225317 DEBUG nova.virt.libvirt.host [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.744 225317 INFO nova.virt.libvirt.host [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] UEFI support detected#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.745 225317 INFO nova.compute.manager [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.746 225317 DEBUG oslo.service.loopingcall [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.746 225317 DEBUG nova.compute.manager [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 04:53:16 np0005591762 nova_compute[225313]: 2026-01-22 09:53:16.747 225317 DEBUG nova.network.neutron [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 04:53:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:16.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:17 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.221 225317 DEBUG nova.network.neutron [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.234 225317 INFO nova.compute.manager [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Took 0.49 seconds to deallocate network for instance.#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.366 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.366 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.396 225317 DEBUG oslo_concurrency.processutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:53:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:17 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800ade0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:53:17 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3082366264' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.735 225317 DEBUG oslo_concurrency.processutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.339s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.739 225317 DEBUG nova.compute.provider_tree [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:53:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:17.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.750 225317 DEBUG nova.scheduler.client.report [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.765 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.781 225317 INFO nova.scheduler.client.report [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Deleted allocations for instance f8af2f33-afb8-40b5-8850-a24d410bdae2#033[00m
Jan 22 04:53:17 np0005591762 nova_compute[225313]: 2026-01-22 09:53:17.825 225317 DEBUG oslo_concurrency.lockutils [None req-d9e2c2a9-27c6-45ef-89dc-9e216d224049 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:18 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800ade0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.521 225317 DEBUG nova.compute.manager [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received event network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.522 225317 DEBUG oslo_concurrency.lockutils [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.522 225317 DEBUG oslo_concurrency.lockutils [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.522 225317 DEBUG oslo_concurrency.lockutils [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "f8af2f33-afb8-40b5-8850-a24d410bdae2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.522 225317 DEBUG nova.compute.manager [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] No waiting events found dispatching network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.523 225317 WARNING nova.compute.manager [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received unexpected event network-vif-plugged-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d for instance with vm_state deleted and task_state None.#033[00m
Jan 22 04:53:18 np0005591762 nova_compute[225313]: 2026-01-22 09:53:18.523 225317 DEBUG nova.compute.manager [req-fad2dcbe-eecd-420b-bd80-2ab3f57e9158 req-f5cef1fd-7a1b-460c-9bb6-92b6d744ae27 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Received event network-vif-deleted-31d9d6b3-c161-4ddb-8d18-682f61a7fd7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:53:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:18.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:19 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157ae80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:19 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:19 np0005591762 nova_compute[225313]: 2026-01-22 09:53:19.561 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:19 np0005591762 nova_compute[225313]: 2026-01-22 09:53:19.669 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:19.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:20 np0005591762 nova_compute[225313]: 2026-01-22 09:53:20.165 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:20 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800baf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:20.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:21 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800baf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:21 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:21 np0005591762 nova_compute[225313]: 2026-01-22 09:53:21.512 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:21.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:22 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:22.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:23 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:23 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800baf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:23.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:24 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:24.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:25 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:25 np0005591762 nova_compute[225313]: 2026-01-22 09:53:25.167 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:25 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:25.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:26 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800baf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:26 np0005591762 nova_compute[225313]: 2026-01-22 09:53:26.514 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:26.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:27 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:27 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:53:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:27.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:53:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:28 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:28.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:29 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:29 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:29.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:30 np0005591762 nova_compute[225313]: 2026-01-22 09:53:30.169 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:30 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095330 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:53:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:30 np0005591762 podman[228608]: 2026-01-22 09:53:30.819933721 +0000 UTC m=+0.035939050 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 04:53:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:30.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:31 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:31 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:31 np0005591762 nova_compute[225313]: 2026-01-22 09:53:31.493 225317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769075596.490798, f8af2f33-afb8-40b5-8850-a24d410bdae2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:53:31 np0005591762 nova_compute[225313]: 2026-01-22 09:53:31.493 225317 INFO nova.compute.manager [-] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] VM Stopped (Lifecycle Event)#033[00m
Jan 22 04:53:31 np0005591762 nova_compute[225313]: 2026-01-22 09:53:31.506 225317 DEBUG nova.compute.manager [None req-78b98ce0-62dd-4e2e-9a3f-db9e38510e30 - - - - - -] [instance: f8af2f33-afb8-40b5-8850-a24d410bdae2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:53:31 np0005591762 nova_compute[225313]: 2026-01-22 09:53:31.517 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:31.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:32 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:32.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:33 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:33 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:33.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:34 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c003140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:34.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:35 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:35 np0005591762 nova_compute[225313]: 2026-01-22 09:53:35.171 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:35 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:35.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:36 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:36 np0005591762 nova_compute[225313]: 2026-01-22 09:53:36.518 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:36.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:37 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:37 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:37.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:38 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:38.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:39 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c003a60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:39 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:53:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:39.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:53:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:39 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:53:39 np0005591762 podman[228659]: 2026-01-22 09:53:39.851132871 +0000 UTC m=+0.071395597 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:53:40 np0005591762 nova_compute[225313]: 2026-01-22 09:53:40.173 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:40 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:40.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:41 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:41 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c0013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:41 np0005591762 nova_compute[225313]: 2026-01-22 09:53:41.520 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:41.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:42 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:42 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:53:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:42 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:53:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:42 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:53:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:42.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:43 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84340039e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:43 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:43.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:44 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:44.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:45 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:45 np0005591762 nova_compute[225313]: 2026-01-22 09:53:45.175 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:45 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:45.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:45 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:53:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:46 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:46 np0005591762 nova_compute[225313]: 2026-01-22 09:53:46.522 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:46.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:47 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:47.195 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:53:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:47.196 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:53:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:53:47.196 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:53:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:47 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:47.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:48 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8434004ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:48.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:49 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8434004ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:49 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:49.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:50 np0005591762 nova_compute[225313]: 2026-01-22 09:53:50.177 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:50 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:50.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:51 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1328171443' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1328171443' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 04:53:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:51 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8434004ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:51 np0005591762 nova_compute[225313]: 2026-01-22 09:53:51.525 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 4100 writes, 22K keys, 4100 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 4100 writes, 4100 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1512 writes, 7218 keys, 1512 commit groups, 1.0 writes per commit group, ingest: 16.74 MB, 0.03 MB/s#012Interval WAL: 1512 writes, 1512 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    464.2      0.07              0.05        10    0.007       0      0       0.0       0.0#012  L6      1/0   11.99 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.4    544.5    456.9      0.25              0.17         9    0.028     43K   4907       0.0       0.0#012 Sum      1/0   11.99 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.4    421.3    458.6      0.32              0.22        19    0.017     43K   4907       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.7    427.3    429.9      0.15              0.10         8    0.018     23K   2546       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    544.5    456.9      0.25              0.17         9    0.028     43K   4907       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    471.8      0.07              0.05         9    0.008       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.033, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 0.3 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a025f49350#2 capacity: 304.00 MB usage: 8.40 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000124 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(492,8.03 MB,2.64246%) FilterBlock(19,125.42 KB,0.0402902%) IndexBlock(19,247.39 KB,0.0794712%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 22 04:53:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000020s ======
Jan 22 04:53:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:51.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000020s
Jan 22 04:53:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:52 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095352 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:53:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:52.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:53 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:53 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:53.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:54 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:54.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:55 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:55 np0005591762 nova_compute[225313]: 2026-01-22 09:53:55.179 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:55 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:55.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:56 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:56 np0005591762 nova_compute[225313]: 2026-01-22 09:53:56.528 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:53:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:53:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:56.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:57 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800cfe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:57 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84580039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:53:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:57.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:53:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:58 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c003a60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:53:58.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:53:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:53:59 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:53:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:53:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:53:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:53:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:53:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:53:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:53:59.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:00 np0005591762 nova_compute[225313]: 2026-01-22 09:54:00.181 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:00 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84580044e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:00.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:01 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c003c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:01 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:01 np0005591762 nova_compute[225313]: 2026-01-22 09:54:01.530 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:01.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:01 np0005591762 podman[228733]: 2026-01-22 09:54:01.826854749 +0000 UTC m=+0.048866405 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 04:54:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:02 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:54:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:02.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:54:02 np0005591762 ovn_controller[133622]: 2026-01-22T09:54:02Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 04:54:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:03 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84580044e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:03 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:54:03.079 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:54:03 np0005591762 nova_compute[225313]: 2026-01-22 09:54:03.080 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:03 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:54:03.081 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:03 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:54:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:03 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:03.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:04 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:04.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:05 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:05 np0005591762 nova_compute[225313]: 2026-01-22 09:54:05.182 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:05 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84580044e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:05.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:06 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:54:06.083 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:54:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:06 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:06 np0005591762 nova_compute[225313]: 2026-01-22 09:54:06.533 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:54:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:06.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:07 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:07 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:07.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:08 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458005970 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:08.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:54:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4196974331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:54:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:09 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:09 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:09.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:09 np0005591762 nova_compute[225313]: 2026-01-22 09:54:09.982 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.004 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.004 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.004 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.015 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.015 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.015 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.016 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.016 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.184 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:10 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 nova_compute[225313]: 2026-01-22 09:54:10.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:10 np0005591762 podman[228863]: 2026-01-22 09:54:10.852233174 +0000 UTC m=+0.073133715 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:54:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:54:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:10.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:54:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:11 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458005970 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:11 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:11 np0005591762 nova_compute[225313]: 2026-01-22 09:54:11.535 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:11.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:12 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:54:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:12.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:54:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:13 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:13 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458005970 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:13 np0005591762 nova_compute[225313]: 2026-01-22 09:54:13.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:13.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:14 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:14.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:15 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.186 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:15 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.740 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.740 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.740 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.740 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:54:15 np0005591762 nova_compute[225313]: 2026-01-22 09:54:15.741 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:54:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:15.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:54:16 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/401078288' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.085 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.345s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.290 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.291 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4986MB free_disk=59.942726135253906GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, 
"label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": 
"0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.292 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.292 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.349 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.349 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.365 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:54:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:16 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458006a70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.537 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.710 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.345s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.714 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.727 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.746 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:54:16 np0005591762 nova_compute[225313]: 2026-01-22 09:54:16.746 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:54:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 04:54:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 7997 writes, 31K keys, 7997 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 7997 writes, 1974 syncs, 4.05 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2079 writes, 6139 keys, 2079 commit groups, 1.0 writes per commit group, ingest: 6.38 MB, 0.01 MB/s#012Interval WAL: 2079 writes, 934 syncs, 2.23 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memta
Jan 22 04:54:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:16.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:17 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:17 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:17.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:18 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:18.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:19 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458006a70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:19 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:19.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:20 np0005591762 nova_compute[225313]: 2026-01-22 09:54:20.189 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:20 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:20.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:21 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:21 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:21 np0005591762 nova_compute[225313]: 2026-01-22 09:54:21.538 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:21.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:22 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x55832157d5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:22.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:23 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:23 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458006a70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:23.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:24 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:24.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:25 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:25 np0005591762 nova_compute[225313]: 2026-01-22 09:54:25.191 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:25 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:25.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:26 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458006a70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:26 np0005591762 nova_compute[225313]: 2026-01-22 09:54:26.540 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:26.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:27 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:27 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:27.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:28 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:28.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:29 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:29 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8458006a70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:29.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:30 np0005591762 nova_compute[225313]: 2026-01-22 09:54:30.193 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:30 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844c004d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:30.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:31 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f843c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:31 np0005591762 kernel: ganesha.nfsd[228224]: segfault at 50 ip 00007f84cae8132e sp 00007f8452ffc210 error 4 in libntirpc.so.5.8[7f84cae66000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 22 04:54:31 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:54:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[227637]: 22/01/2026 09:54:31 : epoch 6971f36e : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f844800a090 fd 39 proxy ignored for local
Jan 22 04:54:31 np0005591762 nova_compute[225313]: 2026-01-22 09:54:31.541 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:31 np0005591762 systemd[1]: Started Process Core Dump (PID 228980/UID 0).
Jan 22 04:54:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:31.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:32 np0005591762 podman[229006]: 2026-01-22 09:54:32.349066697 +0000 UTC m=+0.038844988 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:54:32 np0005591762 systemd-coredump[228981]: Process 227641 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007f84cae8132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:54:32 np0005591762 systemd[1]: systemd-coredump@12-228980-0.service: Deactivated successfully.
Jan 22 04:54:32 np0005591762 systemd[1]: systemd-coredump@12-228980-0.service: Consumed 1.016s CPU time.
Jan 22 04:54:32 np0005591762 podman[229028]: 2026-01-22 09:54:32.645757021 +0000 UTC m=+0.018109319 container died dd55f711f8f4139a8a8080e47203faf95f9e02060357ea3757e21900dc2da547 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:54:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:32 np0005591762 systemd[1]: var-lib-containers-storage-overlay-65809947feb3d6d68cf940cbf4e052dccef5720549d487a3b065a2a89383a674-merged.mount: Deactivated successfully.
Jan 22 04:54:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:32 np0005591762 podman[229028]: 2026-01-22 09:54:32.666210057 +0000 UTC m=+0.038562355 container remove dd55f711f8f4139a8a8080e47203faf95f9e02060357ea3757e21900dc2da547 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 22 04:54:32 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:54:32 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:54:32 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.062s CPU time.
Jan 22 04:54:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:32.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:33.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:34.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:35 np0005591762 nova_compute[225313]: 2026-01-22 09:54:35.193 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:35.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:36 np0005591762 nova_compute[225313]: 2026-01-22 09:54:36.543 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:36.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095437 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:54:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:37.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:38.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:39.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:40 np0005591762 nova_compute[225313]: 2026-01-22 09:54:40.194 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:40.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:41 np0005591762 nova_compute[225313]: 2026-01-22 09:54:41.545 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:41 np0005591762 podman[229072]: 2026-01-22 09:54:41.83605563 +0000 UTC m=+0.059151249 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 04:54:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:41.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:42.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:43 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 13.
Jan 22 04:54:43 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:54:43 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Consumed 1.062s CPU time.
Jan 22 04:54:43 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:54:43 np0005591762 podman[229134]: 2026-01-22 09:54:43.162890507 +0000 UTC m=+0.026976930 container create 061622a1b4d6675a526b61782a00c26b035987b269951c3276ef886ec7a0a66f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 22 04:54:43 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a385a77c53e79455c13ee316f3081f35518fa978de206be05866876caef478/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:54:43 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a385a77c53e79455c13ee316f3081f35518fa978de206be05866876caef478/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:54:43 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a385a77c53e79455c13ee316f3081f35518fa978de206be05866876caef478/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:54:43 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a385a77c53e79455c13ee316f3081f35518fa978de206be05866876caef478/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:54:43 np0005591762 podman[229134]: 2026-01-22 09:54:43.210809215 +0000 UTC m=+0.074895648 container init 061622a1b4d6675a526b61782a00c26b035987b269951c3276ef886ec7a0a66f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 22 04:54:43 np0005591762 podman[229134]: 2026-01-22 09:54:43.214581097 +0000 UTC m=+0.078667520 container start 061622a1b4d6675a526b61782a00c26b035987b269951c3276ef886ec7a0a66f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 22 04:54:43 np0005591762 bash[229134]: 061622a1b4d6675a526b61782a00c26b035987b269951c3276ef886ec7a0a66f
Jan 22 04:54:43 np0005591762 podman[229134]: 2026-01-22 09:54:43.151839879 +0000 UTC m=+0.015926323 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:54:43 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:43 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:43.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:44.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:45 np0005591762 nova_compute[225313]: 2026-01-22 09:54:45.195 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:45.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:46 np0005591762 nova_compute[225313]: 2026-01-22 09:54:46.547 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:46.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:54:47.196 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:54:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:54:47.197 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:54:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:54:47.197 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:54:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:47.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:48.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:49 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:54:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:49 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:54:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:49.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:50 np0005591762 nova_compute[225313]: 2026-01-22 09:54:50.198 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:50.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:51 np0005591762 nova_compute[225313]: 2026-01-22 09:54:51.549 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:51.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:52.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:53.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:54.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:55 np0005591762 nova_compute[225313]: 2026-01-22 09:54:55.201 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:55 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:54:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:55.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:54:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:56 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1edc002ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:56 np0005591762 nova_compute[225313]: 2026-01-22 09:54:56.551 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:54:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:54:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:56.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:57 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee0001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095457 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:54:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:57 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:57.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:58 np0005591762 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 22 04:54:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:58 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee8001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:54:58.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:54:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:59 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1edc0039d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:54:59 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1edc0039d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:54:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:54:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:54:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:54:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:54:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:54:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:54:59.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:00 np0005591762 nova_compute[225313]: 2026-01-22 09:55:00.204 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:00 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:00.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:01 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee8001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:01 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1edc0039d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:01 np0005591762 nova_compute[225313]: 2026-01-22 09:55:01.551 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:01.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:02 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1edc0039d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:02 np0005591762 podman[229249]: 2026-01-22 09:55:02.813076127 +0000 UTC m=+0.036682354 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 04:55:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:02.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:03 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:03 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:03.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:04 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:04.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:05 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:05.179 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:55:05 np0005591762 nova_compute[225313]: 2026-01-22 09:55:05.179 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:05.180 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:55:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:05.180 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:05 np0005591762 nova_compute[225313]: 2026-01-22 09:55:05.206 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:05 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:05.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:06 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:06 np0005591762 nova_compute[225313]: 2026-01-22 09:55:06.553 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:55:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:55:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:06.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:07 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee0002880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:07 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:55:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:55:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:55:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:55:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:07.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:08 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:08.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:09 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1eec002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:09 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee0002880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:09.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:55:10 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3750233027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.207 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:10 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.746 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.747 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.747 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.747 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.758 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.758 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.758 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.758 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:10 np0005591762 nova_compute[225313]: 2026-01-22 09:55:10.758 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:55:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:55:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:55:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:10.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:11 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:11 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1edc005250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:11 np0005591762 nova_compute[225313]: 2026-01-22 09:55:11.555 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:11 np0005591762 nova_compute[225313]: 2026-01-22 09:55:11.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:11.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:12 np0005591762 podman[229469]: 2026-01-22 09:55:12.491089701 +0000 UTC m=+0.057780674 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 04:55:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:12 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee00039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:12 np0005591762 nova_compute[225313]: 2026-01-22 09:55:12.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:12.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:13 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee00039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:13 np0005591762 kernel: ganesha.nfsd[229226]: segfault at 50 ip 00007f1f743fc32e sp 00007f1efb7fd210 error 4 in libntirpc.so.5.8[7f1f743e1000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 22 04:55:13 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:55:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229146]: 22/01/2026 09:55:13 : epoch 6971f3e3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ee8009ec0 fd 38 proxy ignored for local
Jan 22 04:55:13 np0005591762 systemd[1]: Started Process Core Dump (PID 229496/UID 0).
Jan 22 04:55:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:13.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:14 np0005591762 systemd-coredump[229497]: Process 229150 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007f1f743fc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:55:14 np0005591762 systemd[1]: systemd-coredump@13-229496-0.service: Deactivated successfully.
Jan 22 04:55:14 np0005591762 podman[229503]: 2026-01-22 09:55:14.649200194 +0000 UTC m=+0.017907996 container died 061622a1b4d6675a526b61782a00c26b035987b269951c3276ef886ec7a0a66f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 04:55:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:14 np0005591762 systemd[1]: var-lib-containers-storage-overlay-f9a385a77c53e79455c13ee316f3081f35518fa978de206be05866876caef478-merged.mount: Deactivated successfully.
Jan 22 04:55:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:14 np0005591762 podman[229503]: 2026-01-22 09:55:14.669490221 +0000 UTC m=+0.038198012 container remove 061622a1b4d6675a526b61782a00c26b035987b269951c3276ef886ec7a0a66f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 22 04:55:14 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:55:14 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:55:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:14.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:15 np0005591762 nova_compute[225313]: 2026-01-22 09:55:15.208 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:15 np0005591762 nova_compute[225313]: 2026-01-22 09:55:15.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:15.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:16 np0005591762 nova_compute[225313]: 2026-01-22 09:55:16.555 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:16.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:17 np0005591762 nova_compute[225313]: 2026-01-22 09:55:17.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:55:17 np0005591762 nova_compute[225313]: 2026-01-22 09:55:17.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:17 np0005591762 nova_compute[225313]: 2026-01-22 09:55:17.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:17 np0005591762 nova_compute[225313]: 2026-01-22 09:55:17.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:17 np0005591762 nova_compute[225313]: 2026-01-22 09:55:17.739 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:55:17 np0005591762 nova_compute[225313]: 2026-01-22 09:55:17.739 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:17.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:18 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:55:18 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3478475163' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.081 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.279 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.280 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4939MB free_disk=59.92180633544922GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, 
"label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": 
"0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.280 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.281 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.322 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.322 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.334 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:18 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:55:18 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/752552469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.688 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.353s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.693 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.706 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.708 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:55:18 np0005591762 nova_compute[225313]: 2026-01-22 09:55:18.709 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.428s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:18.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095519 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:55:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:19.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:20 np0005591762 nova_compute[225313]: 2026-01-22 09:55:20.210 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:20.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:21 np0005591762 nova_compute[225313]: 2026-01-22 09:55:21.557 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:21.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:22.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:24.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:25 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 14.
Jan 22 04:55:25 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:55:25 np0005591762 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2...
Jan 22 04:55:25 np0005591762 podman[229629]: 2026-01-22 09:55:25.181841741 +0000 UTC m=+0.031607572 container create b3d647a2fbd692d6856c9a7341926ba7ec21afb4ba28dc729f684510df6a326a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 04:55:25 np0005591762 nova_compute[225313]: 2026-01-22 09:55:25.211 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:25 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c9a323b672635eca961bab8d8984b42e0ae0980bd14fb676b083fdb6c54f63/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 22 04:55:25 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c9a323b672635eca961bab8d8984b42e0ae0980bd14fb676b083fdb6c54f63/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:55:25 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c9a323b672635eca961bab8d8984b42e0ae0980bd14fb676b083fdb6c54f63/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 04:55:25 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c9a323b672635eca961bab8d8984b42e0ae0980bd14fb676b083fdb6c54f63/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.qniaxp-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 22 04:55:25 np0005591762 podman[229629]: 2026-01-22 09:55:25.224364057 +0000 UTC m=+0.074129888 container init b3d647a2fbd692d6856c9a7341926ba7ec21afb4ba28dc729f684510df6a326a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 04:55:25 np0005591762 podman[229629]: 2026-01-22 09:55:25.228711616 +0000 UTC m=+0.078477447 container start b3d647a2fbd692d6856c9a7341926ba7ec21afb4ba28dc729f684510df6a326a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 22 04:55:25 np0005591762 bash[229629]: b3d647a2fbd692d6856c9a7341926ba7ec21afb4ba28dc729f684510df6a326a
Jan 22 04:55:25 np0005591762 podman[229629]: 2026-01-22 09:55:25.169297933 +0000 UTC m=+0.019063765 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 04:55:25 np0005591762 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:25 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:25.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:26 np0005591762 nova_compute[225313]: 2026-01-22 09:55:26.559 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:26.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095527 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:55:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:27.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:28.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:29.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:30 np0005591762 nova_compute[225313]: 2026-01-22 09:55:30.213 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:30.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:31 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:31 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:31 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 22 04:55:31 np0005591762 nova_compute[225313]: 2026-01-22 09:55:31.561 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:31 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:31 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:55:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:31 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:55:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:31.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:32.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:33 np0005591762 podman[229716]: 2026-01-22 09:55:33.817507478 +0000 UTC m=+0.040026690 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 04:55:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:33.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:34.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:35 np0005591762 nova_compute[225313]: 2026-01-22 09:55:35.215 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:35.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:36 np0005591762 nova_compute[225313]: 2026-01-22 09:55:36.564 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:36.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 22 04:55:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:37.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:37 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 22 04:55:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:38 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe834000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:39.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:39 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe830001ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:39 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:39.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:40 np0005591762 nova_compute[225313]: 2026-01-22 09:55:40.217 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:40 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe834001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:40 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 22 04:55:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:40 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 22 04:55:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:41.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:41 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe830002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:41 np0005591762 nova_compute[225313]: 2026-01-22 09:55:41.566 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095541 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:55:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:41 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe830002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:41.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:42 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:42 np0005591762 podman[229757]: 2026-01-22 09:55:42.867029982 +0000 UTC m=+0.089890161 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, 
org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:55:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:43.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:43 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:43 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe830002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:43.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:43 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.330701) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075744330746, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2370, "num_deletes": 251, "total_data_size": 6225662, "memory_usage": 6310912, "flush_reason": "Manual Compaction"}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075744339570, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4039220, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21081, "largest_seqno": 23446, "table_properties": {"data_size": 4029739, "index_size": 5911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19882, "raw_average_key_size": 20, "raw_value_size": 4010563, "raw_average_value_size": 4096, "num_data_blocks": 258, "num_entries": 979, "num_filter_entries": 979, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769075537, "oldest_key_time": 1769075537, "file_creation_time": 1769075744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8894 microseconds, and 6368 cpu microseconds.
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.339601) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4039220 bytes OK
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.339615) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.341250) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.341261) EVENT_LOG_v1 {"time_micros": 1769075744341258, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.341275) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6215229, prev total WAL file size 6215229, number of live WAL files 2.
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.342145) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3944KB)], [39(11MB)]
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075744342175, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 16607029, "oldest_snapshot_seqno": -1}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5548 keys, 14457011 bytes, temperature: kUnknown
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075744374357, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 14457011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14418456, "index_size": 23597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13893, "raw_key_size": 140029, "raw_average_key_size": 25, "raw_value_size": 14316431, "raw_average_value_size": 2580, "num_data_blocks": 972, "num_entries": 5548, "num_filter_entries": 5548, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769075744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.374510) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 14457011 bytes
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.374924) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 515.6 rd, 448.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.0 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 6072, records dropped: 524 output_compression: NoCompression
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.374937) EVENT_LOG_v1 {"time_micros": 1769075744374931, "job": 22, "event": "compaction_finished", "compaction_time_micros": 32212, "compaction_time_cpu_micros": 20540, "output_level": 6, "num_output_files": 1, "total_output_size": 14457011, "num_input_records": 6072, "num_output_records": 5548, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075744375446, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075744376738, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.342092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.376859) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.376863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.376864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.376866) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:55:44 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:55:44.376867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:55:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:44 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe830002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:45.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:45 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:45 np0005591762 nova_compute[225313]: 2026-01-22 09:55:45.220 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:45 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:45.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:46 np0005591762 nova_compute[225313]: 2026-01-22 09:55:46.569 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:46 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8300041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:47.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:47 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8300041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:47.198 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:47.199 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:47.199 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095547 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 22 04:55:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:47 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:47.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:48 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe82c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:49.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:49 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8300041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:49 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8300041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:49.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:49 np0005591762 nova_compute[225313]: 2026-01-22 09:55:49.992 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:49 np0005591762 nova_compute[225313]: 2026-01-22 09:55:49.992 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.005 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.061 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.062 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.066 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.066 225317 INFO nova.compute.claims [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.133 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.222 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:50 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:55:50 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2096890528' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.479 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.483 225317 DEBUG nova.compute.provider_tree [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.498 225317 DEBUG nova.scheduler.client.report [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.514 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.514 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.552 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.553 225317 DEBUG nova.network.neutron [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.564 225317 INFO nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.576 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 04:55:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:50 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8300041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.647 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.648 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.648 225317 INFO nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Creating image(s)#033[00m
Jan 22 04:55:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.667 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.688 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.708 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.711 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.757 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.758 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "9db187949728ea707722fd244d769f131efa8688" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.759 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.759 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.778 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.781 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.922 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.946 225317 DEBUG nova.policy [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4428dd9b0fb64c25b8f33b0050d4ef6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 04:55:50 np0005591762 nova_compute[225313]: 2026-01-22 09:55:50.972 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] resizing rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 22 04:55:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:51.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.034 225317 DEBUG nova.objects.instance [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'migration_context' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.053 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.053 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Ensure instance console log exists: /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.053 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.054 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.054 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:51 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe834008e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.421 225317 DEBUG nova.network.neutron [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Successfully created port: 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 04:55:51 np0005591762 nova_compute[225313]: 2026-01-22 09:55:51.570 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:51 np0005591762 kernel: ganesha.nfsd[229739]: segfault at 50 ip 00007fe8bc4ea32e sp 00007fe83affc210 error 4 in libntirpc.so.5.8[7fe8bc4cf000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 22 04:55:51 np0005591762 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 22 04:55:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp[229641]: 22/01/2026 09:55:51 : epoch 6971f40d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8300041f0 fd 38 proxy ignored for local
Jan 22 04:55:51 np0005591762 systemd[1]: Started Process Core Dump (PID 229977/UID 0).
Jan 22 04:55:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000021s ======
Jan 22 04:55:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:51.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000021s
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.144 225317 DEBUG nova.network.neutron [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Successfully updated port: 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.165 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.165 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.165 225317 DEBUG nova.network.neutron [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.230 225317 DEBUG nova.compute.manager [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-changed-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.230 225317 DEBUG nova.compute.manager [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing instance network info cache due to event network-changed-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.231 225317 DEBUG oslo_concurrency.lockutils [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.270 225317 DEBUG nova.network.neutron [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 04:55:52 np0005591762 systemd-coredump[229978]: Process 229645 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007fe8bc4ea32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 22 04:55:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:52 np0005591762 systemd[1]: systemd-coredump@14-229977-0.service: Deactivated successfully.
Jan 22 04:55:52 np0005591762 systemd[1]: systemd-coredump@14-229977-0.service: Consumed 1.027s CPU time.
Jan 22 04:55:52 np0005591762 podman[230009]: 2026-01-22 09:55:52.735450211 +0000 UTC m=+0.017627477 container died b3d647a2fbd692d6856c9a7341926ba7ec21afb4ba28dc729f684510df6a326a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 22 04:55:52 np0005591762 systemd[1]: var-lib-containers-storage-overlay-64c9a323b672635eca961bab8d8984b42e0ae0980bd14fb676b083fdb6c54f63-merged.mount: Deactivated successfully.
Jan 22 04:55:52 np0005591762 podman[230009]: 2026-01-22 09:55:52.753198183 +0000 UTC m=+0.035375429 container remove b3d647a2fbd692d6856c9a7341926ba7ec21afb4ba28dc729f684510df6a326a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-nfs-cephfs-1-0-compute-2-qniaxp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 22 04:55:52 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Main process exited, code=exited, status=139/n/a
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.759 225317 DEBUG nova.network.neutron [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.772 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.772 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Instance network_info: |[{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.772 225317 DEBUG oslo_concurrency.lockutils [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.772 225317 DEBUG nova.network.neutron [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing network info cache for port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.774 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Start _get_guest_xml network_info=[{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'encryption_options': None, 'image_id': 'bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.778 225317 WARNING nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.781 225317 DEBUG nova.virt.libvirt.host [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.781 225317 DEBUG nova.virt.libvirt.host [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.786 225317 DEBUG nova.virt.libvirt.host [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.786 225317 DEBUG nova.virt.libvirt.host [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.786 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.786 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T09:51:32Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6eff66ba-fb3e-4ca7-b05b-920b01d9affd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.787 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.787 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.787 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.787 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.787 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.787 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.788 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.788 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.788 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.788 225317 DEBUG nova.virt.hardware [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 04:55:52 np0005591762 nova_compute[225313]: 2026-01-22 09:55:52.790 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:52 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:55:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:53.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:53 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 04:55:53 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3192174577' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.148 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.358s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.165 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.168 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:53 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 04:55:53 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1046101136' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.524 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.356s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.525 225317 DEBUG nova.virt.libvirt.vif [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:55:50Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.526 225317 DEBUG nova.network.os_vif_util [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.526 225317 DEBUG nova.network.os_vif_util [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.528 225317 DEBUG nova.objects.instance [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.533 225317 DEBUG nova.network.neutron [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updated VIF entry in instance network info cache for port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.534 225317 DEBUG nova.network.neutron [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:55:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.716 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] End _get_guest_xml xml=<domain type="kvm">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <uuid>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</uuid>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <name>instance-00000006</name>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <memory>131072</memory>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <vcpu>1</vcpu>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:creationTime>2026-01-22 09:55:52</nova:creationTime>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:flavor name="m1.nano">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:memory>128</nova:memory>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:disk>1</nova:disk>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:swap>0</nova:swap>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:vcpus>1</nova:vcpus>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </nova:flavor>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:owner>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </nova:owner>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <nova:ports>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        </nova:port>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </nova:ports>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </nova:instance>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <sysinfo type="smbios">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <entry name="manufacturer">RDO</entry>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <entry name="product">OpenStack Compute</entry>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <entry name="serial">c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <entry name="uuid">c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <entry name="family">Virtual Machine</entry>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <boot dev="hd"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <smbios mode="sysinfo"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <vmcoreinfo/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <clock offset="utc">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <timer name="hpet" present="no"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <cpu mode="host-model" match="exact">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <disk type="network" device="disk">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <target dev="vda" bus="virtio"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <disk type="network" device="cdrom">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <target dev="sda" bus="sata"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <interface type="ethernet">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <mac address="fa:16:3e:58:5d:ec"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <mtu size="1442"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <target dev="tap2c18aeb2-0a"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <serial type="pty">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <log file="/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log" append="off"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <input type="tablet" bus="usb"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <rng model="virtio">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <backend model="random">/dev/urandom</backend>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <controller type="usb" index="0"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    <memballoon model="virtio">
Jan 22 04:55:53 np0005591762 nova_compute[225313]:      <stats period="10"/>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:55:53 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:55:53 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:55:53 np0005591762 nova_compute[225313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.717 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Preparing to wait for external event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.718 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.718 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.718 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.719 225317 DEBUG nova.virt.libvirt.vif [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:55:50Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.719 225317 DEBUG nova.network.os_vif_util [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.719 225317 DEBUG nova.network.os_vif_util [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.720 225317 DEBUG os_vif [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.721 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.721 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.721 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.722 225317 DEBUG oslo_concurrency.lockutils [req-7b1bc4ba-2ffc-474c-96c1-5b6fa084352c req-9a8f7aed-c91d-47f8-937f-4ea322f79d6f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.724 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.724 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c18aeb2-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.725 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c18aeb2-0a, col_values=(('external_ids', {'iface-id': '2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:5d:ec', 'vm-uuid': 'c2a740a7-21a6-42d9-9b2f-8ba5143e0cec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.726 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:53 np0005591762 NetworkManager[48910]: <info>  [1769075753.7270] manager: (tap2c18aeb2-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.728 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.731 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.732 225317 INFO os_vif [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a')#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.763 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.763 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.764 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:58:5d:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.764 225317 INFO nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Using config drive#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.782 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:53.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.994 225317 INFO nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Creating config drive at /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/disk.config#033[00m
Jan 22 04:55:53 np0005591762 nova_compute[225313]: 2026-01-22 09:55:53.998 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp66geexn8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.116 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp66geexn8" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.135 225317 DEBUG nova.storage.rbd_utils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.137 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/disk.config c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.221 225317 DEBUG oslo_concurrency.processutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/disk.config c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.221 225317 INFO nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Deleting local config drive /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/disk.config because it was imported into RBD.#033[00m
Jan 22 04:55:54 np0005591762 systemd[1]: Starting libvirt secret daemon...
Jan 22 04:55:54 np0005591762 systemd[1]: Started libvirt secret daemon.
Jan 22 04:55:54 np0005591762 kernel: tap2c18aeb2-0a: entered promiscuous mode
Jan 22 04:55:54 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:54Z|00037|binding|INFO|Claiming lport 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f for this chassis.
Jan 22 04:55:54 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:54Z|00038|binding|INFO|2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f: Claiming fa:16:3e:58:5d:ec 10.100.0.5
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.290 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.292 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 NetworkManager[48910]: <info>  [1769075754.2942] manager: (tap2c18aeb2-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.299 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:5d:ec 10.100.0.5'], port_security=['fa:16:3e:58:5d:ec 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c2a740a7-21a6-42d9-9b2f-8ba5143e0cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac6bd202-9ce4-49d7-a7dd-c8aca89509c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3e6744-68b4-4126-a841-767d678dbcb8, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.300 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f in datapath f3e7c2ec-12ff-4a29-aade-135175be50e3 bound to our chassis#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.302 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f3e7c2ec-12ff-4a29-aade-135175be50e3#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.310 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e4570d82-29e9-41cf-a8a0-61b85387d457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.311 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf3e7c2ec-11 in ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.312 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf3e7c2ec-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.312 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[89feaf9e-2523-47b0-8612-a1508ece9f35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.313 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[b082366f-936d-4320-8be4-a845f0d9cc23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.324 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[d5cb903a-1704-4b21-838e-1b3512a86028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 systemd-machined[193990]: New machine qemu-2-instance-00000006.
Jan 22 04:55:54 np0005591762 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Jan 22 04:55:54 np0005591762 systemd-udevd[230199]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.345 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1de60234-042f-4c4b-80f6-a6996d62e4d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 NetworkManager[48910]: <info>  [1769075754.3611] device (tap2c18aeb2-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.361 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 NetworkManager[48910]: <info>  [1769075754.3626] device (tap2c18aeb2-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 04:55:54 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:54Z|00039|binding|INFO|Setting lport 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f ovn-installed in OVS
Jan 22 04:55:54 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:54Z|00040|binding|INFO|Setting lport 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f up in Southbound
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.369 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[f00128b2-77d6-4d94-8d31-60032fedcb78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.370 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.375 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[c122f870-bc94-444f-b2a8-b53b84495b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 NetworkManager[48910]: <info>  [1769075754.3763] manager: (tapf3e7c2ec-10): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.400 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7fa500-c86b-4c3f-bee6-fa60fd691bb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.402 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[9e51e117-361e-436a-993f-04a26432e556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 NetworkManager[48910]: <info>  [1769075754.4185] device (tapf3e7c2ec-10): carrier: link connected
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.421 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[4966c048-d27f-420d-b1bd-8d217fb307c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.433 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c3fa2e-b58d-4078-aa7d-e3d91c444a9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3e7c2ec-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:b1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333197, 'reachable_time': 44428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230221, 'error': None, 'target': 'ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.445 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1991a1d0-1653-4080-b5d6-db91802a0536]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:b102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333197, 'tstamp': 333197}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230222, 'error': None, 'target': 'ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.456 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[5e619860-8301-4107-880c-6b415f103a17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3e7c2ec-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:b1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333197, 'reachable_time': 44428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230223, 'error': None, 'target': 'ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.480 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[f27a4bdc-5b76-4fdf-b3ae-69ff16c9198d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.510 225317 DEBUG nova.compute.manager [req-9bb4fabc-56e2-4515-aede-28dfafaa9274 req-c45d8b5e-f71b-458a-8ba8-cab3bb52c169 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.510 225317 DEBUG oslo_concurrency.lockutils [req-9bb4fabc-56e2-4515-aede-28dfafaa9274 req-c45d8b5e-f71b-458a-8ba8-cab3bb52c169 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.510 225317 DEBUG oslo_concurrency.lockutils [req-9bb4fabc-56e2-4515-aede-28dfafaa9274 req-c45d8b5e-f71b-458a-8ba8-cab3bb52c169 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.510 225317 DEBUG oslo_concurrency.lockutils [req-9bb4fabc-56e2-4515-aede-28dfafaa9274 req-c45d8b5e-f71b-458a-8ba8-cab3bb52c169 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.511 225317 DEBUG nova.compute.manager [req-9bb4fabc-56e2-4515-aede-28dfafaa9274 req-c45d8b5e-f71b-458a-8ba8-cab3bb52c169 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Processing event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.519 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1024f9a1-e4f1-42ea-b81b-33d68bf767f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.520 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3e7c2ec-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.520 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.521 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3e7c2ec-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:54 np0005591762 kernel: tapf3e7c2ec-10: entered promiscuous mode
Jan 22 04:55:54 np0005591762 NetworkManager[48910]: <info>  [1769075754.5238] manager: (tapf3e7c2ec-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.522 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.525 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf3e7c2ec-10, col_values=(('external_ids', {'iface-id': '1240fd1c-f79e-4e91-8b0f-3356e4c99edb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.526 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.527 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:54Z|00041|binding|INFO|Releasing lport 1240fd1c-f79e-4e91-8b0f-3356e4c99edb from this chassis (sb_readonly=0)
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.528 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f3e7c2ec-12ff-4a29-aade-135175be50e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f3e7c2ec-12ff-4a29-aade-135175be50e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.536 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e43c9778-91fc-4e0b-92f5-04ad9d5b3ed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.537 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-f3e7c2ec-12ff-4a29-aade-135175be50e3
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/f3e7c2ec-12ff-4a29-aade-135175be50e3.pid.haproxy
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID f3e7c2ec-12ff-4a29-aade-135175be50e3
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 04:55:54 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:55:54.539 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'env', 'PROCESS_TAG=haproxy-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f3e7c2ec-12ff-4a29-aade-135175be50e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 04:55:54 np0005591762 nova_compute[225313]: 2026-01-22 09:55:54.540 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:54 np0005591762 podman[230254]: 2026-01-22 09:55:54.823499179 +0000 UTC m=+0.032201713 container create 08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:55:54 np0005591762 systemd[1]: Started libpod-conmon-08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1.scope.
Jan 22 04:55:54 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:55:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4db7cbdd87946036038f8b20dd520bda04ea47623dd8d0f428e73fb9a69fad5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:55:54 np0005591762 podman[230254]: 2026-01-22 09:55:54.873400511 +0000 UTC m=+0.082103055 container init 08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 04:55:54 np0005591762 podman[230254]: 2026-01-22 09:55:54.878485292 +0000 UTC m=+0.087187816 container start 08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:55:54 np0005591762 podman[230254]: 2026-01-22 09:55:54.808908281 +0000 UTC m=+0.017610824 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:55:54 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [NOTICE]   (230270) : New worker (230272) forked
Jan 22 04:55:54 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [NOTICE]   (230270) : Loading success.
Jan 22 04:55:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:55.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.027 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075755.0274568, c2a740a7-21a6-42d9-9b2f-8ba5143e0cec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.028 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] VM Started (Lifecycle Event)#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.029 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.031 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.033 225317 INFO nova.virt.libvirt.driver [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Instance spawned successfully.#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.033 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.046 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.049 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.051 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.052 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.052 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.052 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.053 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.053 225317 DEBUG nova.virt.libvirt.driver [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.126 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.126 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075755.0282438, c2a740a7-21a6-42d9-9b2f-8ba5143e0cec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.127 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] VM Paused (Lifecycle Event)#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.151 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.152 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075755.0310714, c2a740a7-21a6-42d9-9b2f-8ba5143e0cec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.153 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] VM Resumed (Lifecycle Event)#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.173 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.174 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.180 225317 INFO nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Took 4.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.181 225317 DEBUG nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.189 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.222 225317 INFO nova.compute.manager [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Took 5.18 seconds to build instance.#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.224 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:55 np0005591762 nova_compute[225313]: 2026-01-22 09:55:55.230 225317 DEBUG oslo_concurrency.lockutils [None req-8e6e505c-4bc3-4fc9-a5aa-9fe04271ef9d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:55.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:56 np0005591762 nova_compute[225313]: 2026-01-22 09:55:56.568 225317 DEBUG nova.compute.manager [req-b8e6002d-87f4-4b9d-ab45-3331159ee4e2 req-abecdeea-e210-419c-93d8-f033d51127a9 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:55:56 np0005591762 nova_compute[225313]: 2026-01-22 09:55:56.568 225317 DEBUG oslo_concurrency.lockutils [req-b8e6002d-87f4-4b9d-ab45-3331159ee4e2 req-abecdeea-e210-419c-93d8-f033d51127a9 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:55:56 np0005591762 nova_compute[225313]: 2026-01-22 09:55:56.568 225317 DEBUG oslo_concurrency.lockutils [req-b8e6002d-87f4-4b9d-ab45-3331159ee4e2 req-abecdeea-e210-419c-93d8-f033d51127a9 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:55:56 np0005591762 nova_compute[225313]: 2026-01-22 09:55:56.568 225317 DEBUG oslo_concurrency.lockutils [req-b8e6002d-87f4-4b9d-ab45-3331159ee4e2 req-abecdeea-e210-419c-93d8-f033d51127a9 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:55:56 np0005591762 nova_compute[225313]: 2026-01-22 09:55:56.569 225317 DEBUG nova.compute.manager [req-b8e6002d-87f4-4b9d-ab45-3331159ee4e2 req-abecdeea-e210-419c-93d8-f033d51127a9 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:55:56 np0005591762 nova_compute[225313]: 2026-01-22 09:55:56.569 225317 WARNING nova.compute.manager [req-b8e6002d-87f4-4b9d-ab45-3331159ee4e2 req-abecdeea-e210-419c-93d8-f033d51127a9 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f for instance with vm_state active and task_state None.#033[00m
Jan 22 04:55:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:55:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:57.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095557 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:55:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:55:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:57.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:55:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:58 np0005591762 nova_compute[225313]: 2026-01-22 09:55:58.728 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:55:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:55:59.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:55:59 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:59Z|00042|binding|INFO|Releasing lport 1240fd1c-f79e-4e91-8b0f-3356e4c99edb from this chassis (sb_readonly=0)
Jan 22 04:55:59 np0005591762 NetworkManager[48910]: <info>  [1769075759.3373] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 22 04:55:59 np0005591762 NetworkManager[48910]: <info>  [1769075759.3380] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.335 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.371 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:59 np0005591762 ovn_controller[133622]: 2026-01-22T09:55:59Z|00043|binding|INFO|Releasing lport 1240fd1c-f79e-4e91-8b0f-3356e4c99edb from this chassis (sb_readonly=0)
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.375 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.629 225317 DEBUG nova.compute.manager [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-changed-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.629 225317 DEBUG nova.compute.manager [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing instance network info cache due to event network-changed-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.629 225317 DEBUG oslo_concurrency.lockutils [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.630 225317 DEBUG oslo_concurrency.lockutils [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:55:59 np0005591762 nova_compute[225313]: 2026-01-22 09:55:59.630 225317 DEBUG nova.network.neutron [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing network info cache for port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:55:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:55:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:55:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:55:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:55:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:55:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:55:59.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:00 np0005591762 nova_compute[225313]: 2026-01-22 09:56:00.225 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:01.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:01 np0005591762 nova_compute[225313]: 2026-01-22 09:56:01.153 225317 DEBUG nova.network.neutron [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updated VIF entry in instance network info cache for port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:56:01 np0005591762 nova_compute[225313]: 2026-01-22 09:56:01.153 225317 DEBUG nova.network.neutron [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:56:01 np0005591762 nova_compute[225313]: 2026-01-22 09:56:01.169 225317 DEBUG oslo_concurrency.lockutils [req-7e901b61-577c-4f04-9685-25452ebb0fdd req-15d7680b-4873-45eb-90c3-a46f32a0c3f4 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:56:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:01.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:03 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Scheduled restart job, restart counter is at 15.
Jan 22 04:56:03 np0005591762 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:56:03 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Start request repeated too quickly.
Jan 22 04:56:03 np0005591762 systemd[1]: ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2@nfs.cephfs.1.0.compute-2.qniaxp.service: Failed with result 'exit-code'.
Jan 22 04:56:03 np0005591762 systemd[1]: Failed to start Ceph nfs.cephfs.1.0.compute-2.qniaxp for 43df7a30-cf5f-5209-adfd-bf44298b19f2.
Jan 22 04:56:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:03.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:03 np0005591762 nova_compute[225313]: 2026-01-22 09:56:03.731 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:03.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:04 np0005591762 podman[230331]: 2026-01-22 09:56:04.834003395 +0000 UTC m=+0.050731300 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 04:56:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:05.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:05 np0005591762 nova_compute[225313]: 2026-01-22 09:56:05.227 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:05.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:06 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:06Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:5d:ec 10.100.0.5
Jan 22 04:56:06 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:06Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:5d:ec 10.100.0.5
Jan 22 04:56:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:07.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:07.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:08 np0005591762 nova_compute[225313]: 2026-01-22 09:56:08.736 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:09.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:09.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:10 np0005591762 nova_compute[225313]: 2026-01-22 09:56:10.229 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:56:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:11.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:56:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:56:11 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1742191310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:56:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:11 np0005591762 nova_compute[225313]: 2026-01-22 09:56:11.704 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:11 np0005591762 nova_compute[225313]: 2026-01-22 09:56:11.705 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:11 np0005591762 nova_compute[225313]: 2026-01-22 09:56:11.716 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:11 np0005591762 nova_compute[225313]: 2026-01-22 09:56:11.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:11 np0005591762 nova_compute[225313]: 2026-01-22 09:56:11.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:56:11 np0005591762 nova_compute[225313]: 2026-01-22 09:56:11.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:56:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:11.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.029 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.030 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.030 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.030 225317 DEBUG nova.objects.instance [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:56:12 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:56:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:56:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:56:12 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:56:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.700 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.711 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.711 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.712 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.712 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:12 np0005591762 nova_compute[225313]: 2026-01-22 09:56:12.712 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:56:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:13.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:13 np0005591762 nova_compute[225313]: 2026-01-22 09:56:13.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:13 np0005591762 nova_compute[225313]: 2026-01-22 09:56:13.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:13 np0005591762 nova_compute[225313]: 2026-01-22 09:56:13.737 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:13 np0005591762 podman[230460]: 2026-01-22 09:56:13.835914846 +0000 UTC m=+0.059034149 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:56:13 np0005591762 nova_compute[225313]: 2026-01-22 09:56:13.857 225317 INFO nova.compute.manager [None req-25063e6b-d0e0-4dd5-ac89-1505259a367d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Get console output#033[00m
Jan 22 04:56:13 np0005591762 nova_compute[225313]: 2026-01-22 09:56:13.860 225317 INFO oslo.privsep.daemon [None req-25063e6b-d0e0-4dd5-ac89-1505259a367d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpt079fweh/privsep.sock']#033[00m
Jan 22 04:56:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:13.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:14 np0005591762 nova_compute[225313]: 2026-01-22 09:56:14.398 225317 INFO oslo.privsep.daemon [None req-25063e6b-d0e0-4dd5-ac89-1505259a367d 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 22 04:56:14 np0005591762 nova_compute[225313]: 2026-01-22 09:56:14.313 230487 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 22 04:56:14 np0005591762 nova_compute[225313]: 2026-01-22 09:56:14.316 230487 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 22 04:56:14 np0005591762 nova_compute[225313]: 2026-01-22 09:56:14.318 230487 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 22 04:56:14 np0005591762 nova_compute[225313]: 2026-01-22 09:56:14.318 230487 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230487#033[00m
Jan 22 04:56:14 np0005591762 nova_compute[225313]: 2026-01-22 09:56:14.474 230487 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 04:56:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:15.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:15 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:15.076 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:56:15 np0005591762 nova_compute[225313]: 2026-01-22 09:56:15.076 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:15 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:15.077 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:56:15 np0005591762 nova_compute[225313]: 2026-01-22 09:56:15.231 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:56:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:56:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:15.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:16 np0005591762 nova_compute[225313]: 2026-01-22 09:56:16.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:17.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.119 225317 DEBUG oslo_concurrency.lockutils [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "interface-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.119 225317 DEBUG oslo_concurrency.lockutils [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "interface-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.119 225317 DEBUG nova.objects.instance [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'flavor' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.348 225317 DEBUG nova.objects.instance [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'pci_requests' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.357 225317 DEBUG nova.network.neutron [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.468 225317 DEBUG nova.policy [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4428dd9b0fb64c25b8f33b0050d4ef6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 04:56:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:17 np0005591762 nova_compute[225313]: 2026-01-22 09:56:17.866 225317 DEBUG nova.network.neutron [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Successfully created port: 16fe0c70-35aa-4775-8e29-47f94379a9ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 04:56:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:17.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.596 225317 DEBUG nova.network.neutron [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Successfully updated port: 16fe0c70-35aa-4775-8e29-47f94379a9ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.612 225317 DEBUG oslo_concurrency.lockutils [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.613 225317 DEBUG oslo_concurrency.lockutils [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.613 225317 DEBUG nova.network.neutron [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 04:56:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.690 225317 DEBUG nova.compute.manager [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-changed-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.690 225317 DEBUG nova.compute.manager [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing instance network info cache due to event network-changed-16fe0c70-35aa-4775-8e29-47f94379a9ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.690 225317 DEBUG oslo_concurrency.lockutils [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.738 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.738 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:56:18 np0005591762 nova_compute[225313]: 2026-01-22 09:56:18.753 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:56:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:19.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:56:19 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:56:19 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/78879305' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.077 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.125 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.125 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.334 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.335 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4656MB free_disk=59.94276428222656GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.335 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.335 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.408 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.408 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.408 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.440 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:56:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:19 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:56:19 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/990783688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.780 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.339s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.784 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.797 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.815 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.815 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.969 225317 DEBUG nova.network.neutron [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.982 225317 DEBUG oslo_concurrency.lockutils [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.983 225317 DEBUG oslo_concurrency.lockutils [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:56:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.983 225317 DEBUG nova.network.neutron [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing network info cache for port 16fe0c70-35aa-4775-8e29-47f94379a9ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:56:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:19.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.986 225317 DEBUG nova.virt.libvirt.vif [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.986 225317 DEBUG nova.network.os_vif_util [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.987 225317 DEBUG nova.network.os_vif_util [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.987 225317 DEBUG os_vif [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.988 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.988 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.989 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.990 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.991 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16fe0c70-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.991 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap16fe0c70-35, col_values=(('external_ids', {'iface-id': '16fe0c70-35aa-4775-8e29-47f94379a9ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:23:b9', 'vm-uuid': 'c2a740a7-21a6-42d9-9b2f-8ba5143e0cec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:19 np0005591762 NetworkManager[48910]: <info>  [1769075779.9931] manager: (tap16fe0c70-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.995 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.997 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.998 225317 INFO os_vif [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35')#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.998 225317 DEBUG nova.virt.libvirt.vif [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.999 225317 DEBUG nova.network.os_vif_util [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:56:19 np0005591762 nova_compute[225313]: 2026-01-22 09:56:19.999 225317 DEBUG nova.network.os_vif_util [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.001 225317 DEBUG nova.virt.libvirt.guest [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] attach device xml: <interface type="ethernet">
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <mac address="fa:16:3e:f3:23:b9"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <model type="virtio"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <mtu size="1442"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <target dev="tap16fe0c70-35"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]: </interface>
Jan 22 04:56:20 np0005591762 nova_compute[225313]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 22 04:56:20 np0005591762 kernel: tap16fe0c70-35: entered promiscuous mode
Jan 22 04:56:20 np0005591762 NetworkManager[48910]: <info>  [1769075780.0096] manager: (tap16fe0c70-35): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.010 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:20Z|00044|binding|INFO|Claiming lport 16fe0c70-35aa-4775-8e29-47f94379a9ae for this chassis.
Jan 22 04:56:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:20Z|00045|binding|INFO|16fe0c70-35aa-4775-8e29-47f94379a9ae: Claiming fa:16:3e:f3:23:b9 10.100.0.24
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.016 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:23:b9 10.100.0.24'], port_security=['fa:16:3e:f3:23:b9 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'c2a740a7-21a6-42d9-9b2f-8ba5143e0cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-860d8653-9bd9-4e89-8273-d52438f39b9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '36f82e25-219e-420f-acf7-94f16329ca95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3801e067-ec62-4ee2-a3f1-e0f0f4dedd37, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=16fe0c70-35aa-4775-8e29-47f94379a9ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.017 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 16fe0c70-35aa-4775-8e29-47f94379a9ae in datapath 860d8653-9bd9-4e89-8273-d52438f39b9f bound to our chassis#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.018 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 860d8653-9bd9-4e89-8273-d52438f39b9f#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.026 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[71f10234-9237-4fec-9165-54a287f2d740]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.027 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap860d8653-91 in ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.028 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap860d8653-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.028 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[af4ff9e7-8589-41c3-b2f1-d68d1ba98d59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.028 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7f5578-c83c-407b-a522-db465c45f9d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.038 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[455fd335-1868-4ffc-84fb-b00f9ceaed57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 systemd-udevd[230573]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.054 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:20Z|00046|binding|INFO|Setting lport 16fe0c70-35aa-4775-8e29-47f94379a9ae ovn-installed in OVS
Jan 22 04:56:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:20Z|00047|binding|INFO|Setting lport 16fe0c70-35aa-4775-8e29-47f94379a9ae up in Southbound
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.056 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 NetworkManager[48910]: <info>  [1769075780.0621] device (tap16fe0c70-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:56:20 np0005591762 NetworkManager[48910]: <info>  [1769075780.0628] device (tap16fe0c70-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.064 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fed28b-878b-4b6b-abb5-bcb11527bdd4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.079 225317 DEBUG nova.virt.libvirt.driver [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.079 225317 DEBUG nova.virt.libvirt.driver [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.080 225317 DEBUG nova.virt.libvirt.driver [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:58:5d:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.080 225317 DEBUG nova.virt.libvirt.driver [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:f3:23:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.085 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[eb117227-8968-41e1-a4cc-60891f472b88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 NetworkManager[48910]: <info>  [1769075780.0894] manager: (tap860d8653-90): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.090 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[4dae8052-3400-4718-8e38-f8e9b3a156c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 systemd-udevd[230576]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.095 225317 DEBUG nova.virt.libvirt.guest [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:56:20</nova:creationTime>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:56:20 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    <nova:port uuid="16fe0c70-35aa-4775-8e29-47f94379a9ae">
Jan 22 04:56:20 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:56:20 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:56:20 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:56:20 np0005591762 nova_compute[225313]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.113 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[73529be1-97c9-404b-8d20-7fb1c7a095c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.115 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[53c55e81-001a-4a4f-8127-a496955b5e1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.121 225317 DEBUG oslo_concurrency.lockutils [None req-3ecc0c87-1f8e-409d-adb8-b72b59ccf502 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "interface-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 3.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:56:20 np0005591762 NetworkManager[48910]: <info>  [1769075780.1340] device (tap860d8653-90): carrier: link connected
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.139 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[814fe9a1-d765-465a-a323-6e501a68d23b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.152 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[4aff587f-2942-439b-b274-bbc5d33c29f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap860d8653-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:56:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 335769, 'reachable_time': 35407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230590, 'error': None, 'target': 'ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.164 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[253a66ce-012d-4623-a211-3b94d4d3c41e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:5603'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 335769, 'tstamp': 335769}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230591, 'error': None, 'target': 'ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.175 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8c4de9-5256-4006-9649-00059f787d92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap860d8653-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:56:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 335769, 'reachable_time': 35407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230592, 'error': None, 'target': 'ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.196 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5549f9-f63b-4a19-92ae-6a332336bfde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.233 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.238 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[60c5bfe6-5dec-47e0-a120-b5e320d21b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.239 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap860d8653-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.239 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.239 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap860d8653-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:20 np0005591762 kernel: tap860d8653-90: entered promiscuous mode
Jan 22 04:56:20 np0005591762 NetworkManager[48910]: <info>  [1769075780.2419] manager: (tap860d8653-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.240 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.245 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap860d8653-90, col_values=(('external_ids', {'iface-id': '7840162c-ff18-42e9-b217-4c26ac577dbf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.245 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:20Z|00048|binding|INFO|Releasing lport 7840162c-ff18-42e9-b217-4c26ac577dbf from this chassis (sb_readonly=0)
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.248 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/860d8653-9bd9-4e89-8273-d52438f39b9f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/860d8653-9bd9-4e89-8273-d52438f39b9f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.248 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[07384bfc-f551-433e-ad5c-cd200ee75e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.249 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-860d8653-9bd9-4e89-8273-d52438f39b9f
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/860d8653-9bd9-4e89-8273-d52438f39b9f.pid.haproxy
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID 860d8653-9bd9-4e89-8273-d52438f39b9f
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 04:56:20 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:20.249 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f', 'env', 'PROCESS_TAG=haproxy-860d8653-9bd9-4e89-8273-d52438f39b9f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/860d8653-9bd9-4e89-8273-d52438f39b9f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 04:56:20 np0005591762 nova_compute[225313]: 2026-01-22 09:56:20.259 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:20 np0005591762 podman[230620]: 2026-01-22 09:56:20.522980624 +0000 UTC m=+0.032494765 container create 892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 04:56:20 np0005591762 systemd[1]: Started libpod-conmon-892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146.scope.
Jan 22 04:56:20 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:56:20 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24cb637187fffbc12d7c63ec90226e3c0aeaa9477832bce8f881ff2be0f054cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:56:20 np0005591762 podman[230620]: 2026-01-22 09:56:20.575838392 +0000 UTC m=+0.085352534 container init 892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 04:56:20 np0005591762 podman[230620]: 2026-01-22 09:56:20.579716638 +0000 UTC m=+0.089230778 container start 892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 04:56:20 np0005591762 podman[230620]: 2026-01-22 09:56:20.508426235 +0000 UTC m=+0.017940386 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:56:20 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [NOTICE]   (230637) : New worker (230639) forked
Jan 22 04:56:20 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [NOTICE]   (230637) : Loading success.
Jan 22 04:56:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:21.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095621 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:56:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:21 np0005591762 nova_compute[225313]: 2026-01-22 09:56:21.852 225317 DEBUG nova.compute.manager [req-3167331d-d2dc-405a-8fae-a835ecc27431 req-a14f13a4-b4e5-495d-976d-38c32094d759 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:56:21 np0005591762 nova_compute[225313]: 2026-01-22 09:56:21.852 225317 DEBUG oslo_concurrency.lockutils [req-3167331d-d2dc-405a-8fae-a835ecc27431 req-a14f13a4-b4e5-495d-976d-38c32094d759 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:56:21 np0005591762 nova_compute[225313]: 2026-01-22 09:56:21.853 225317 DEBUG oslo_concurrency.lockutils [req-3167331d-d2dc-405a-8fae-a835ecc27431 req-a14f13a4-b4e5-495d-976d-38c32094d759 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:56:21 np0005591762 nova_compute[225313]: 2026-01-22 09:56:21.853 225317 DEBUG oslo_concurrency.lockutils [req-3167331d-d2dc-405a-8fae-a835ecc27431 req-a14f13a4-b4e5-495d-976d-38c32094d759 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:56:21 np0005591762 nova_compute[225313]: 2026-01-22 09:56:21.853 225317 DEBUG nova.compute.manager [req-3167331d-d2dc-405a-8fae-a835ecc27431 req-a14f13a4-b4e5-495d-976d-38c32094d759 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:56:21 np0005591762 nova_compute[225313]: 2026-01-22 09:56:21.854 225317 WARNING nova.compute.manager [req-3167331d-d2dc-405a-8fae-a835ecc27431 req-a14f13a4-b4e5-495d-976d-38c32094d759 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae for instance with vm_state active and task_state None.#033[00m
Jan 22 04:56:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:21.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:22 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:22Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f3:23:b9 10.100.0.24
Jan 22 04:56:22 np0005591762 ovn_controller[133622]: 2026-01-22T09:56:22Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:23:b9 10.100.0.24
Jan 22 04:56:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:23.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.282 225317 DEBUG nova.network.neutron [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updated VIF entry in instance network info cache for port 16fe0c70-35aa-4775-8e29-47f94379a9ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.283 225317 DEBUG nova.network.neutron [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.294 225317 DEBUG oslo_concurrency.lockutils [req-65305b76-7bb5-4989-b156-cdb8cc79e327 req-430d112a-fee8-4321-9b7e-c6325ef2547d e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:56:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.923 225317 DEBUG nova.compute.manager [req-c39630b9-b1e1-4d59-b31b-3cdf3e1c8ed6 req-b42f62b8-713f-44a4-813d-af04b0df2e57 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.923 225317 DEBUG oslo_concurrency.lockutils [req-c39630b9-b1e1-4d59-b31b-3cdf3e1c8ed6 req-b42f62b8-713f-44a4-813d-af04b0df2e57 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.923 225317 DEBUG oslo_concurrency.lockutils [req-c39630b9-b1e1-4d59-b31b-3cdf3e1c8ed6 req-b42f62b8-713f-44a4-813d-af04b0df2e57 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.923 225317 DEBUG oslo_concurrency.lockutils [req-c39630b9-b1e1-4d59-b31b-3cdf3e1c8ed6 req-b42f62b8-713f-44a4-813d-af04b0df2e57 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.924 225317 DEBUG nova.compute.manager [req-c39630b9-b1e1-4d59-b31b-3cdf3e1c8ed6 req-b42f62b8-713f-44a4-813d-af04b0df2e57 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:56:23 np0005591762 nova_compute[225313]: 2026-01-22 09:56:23.924 225317 WARNING nova.compute.manager [req-c39630b9-b1e1-4d59-b31b-3cdf3e1c8ed6 req-b42f62b8-713f-44a4-813d-af04b0df2e57 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae for instance with vm_state active and task_state None.#033[00m
Jan 22 04:56:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:23.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:24 np0005591762 nova_compute[225313]: 2026-01-22 09:56:24.993 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:25.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:25 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:25.079 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:56:25 np0005591762 nova_compute[225313]: 2026-01-22 09:56:25.234 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:25.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:56:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:27.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:56:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:27.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:56:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:29.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:56:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:29.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:29 np0005591762 nova_compute[225313]: 2026-01-22 09:56:29.995 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:30 np0005591762 nova_compute[225313]: 2026-01-22 09:56:30.235 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:31.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:31.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:33.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:33.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:34 np0005591762 nova_compute[225313]: 2026-01-22 09:56:34.997 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:35.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:35 np0005591762 nova_compute[225313]: 2026-01-22 09:56:35.238 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:35 np0005591762 podman[230684]: 2026-01-22 09:56:35.813303049 +0000 UTC m=+0.035189409 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:56:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:36.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:38.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:39.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:39 np0005591762 nova_compute[225313]: 2026-01-22 09:56:39.999 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:40.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:40 np0005591762 nova_compute[225313]: 2026-01-22 09:56:40.239 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:41.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:42.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:43.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:44.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:44 np0005591762 podman[230709]: 2026-01-22 09:56:44.842958915 +0000 UTC m=+0.061264814 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:56:45 np0005591762 nova_compute[225313]: 2026-01-22 09:56:44.999 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:45.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:45 np0005591762 nova_compute[225313]: 2026-01-22 09:56:45.241 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:46.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:47.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:47.199 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:56:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:47.199 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:56:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:56:47.200 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:56:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:48.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:49.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:50 np0005591762 nova_compute[225313]: 2026-01-22 09:56:50.001 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:50.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:50 np0005591762 nova_compute[225313]: 2026-01-22 09:56:50.242 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:52.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:52 np0005591762 nova_compute[225313]: 2026-01-22 09:56:52.050 225317 DEBUG nova.compute.manager [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-changed-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:56:52 np0005591762 nova_compute[225313]: 2026-01-22 09:56:52.050 225317 DEBUG nova.compute.manager [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing instance network info cache due to event network-changed-16fe0c70-35aa-4775-8e29-47f94379a9ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:56:52 np0005591762 nova_compute[225313]: 2026-01-22 09:56:52.050 225317 DEBUG oslo_concurrency.lockutils [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:56:52 np0005591762 nova_compute[225313]: 2026-01-22 09:56:52.051 225317 DEBUG oslo_concurrency.lockutils [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:56:52 np0005591762 nova_compute[225313]: 2026-01-22 09:56:52.051 225317 DEBUG nova.network.neutron [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing network info cache for port 16fe0c70-35aa-4775-8e29-47f94379a9ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:56:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:53.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:54.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:54 np0005591762 nova_compute[225313]: 2026-01-22 09:56:54.250 225317 DEBUG nova.network.neutron [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updated VIF entry in instance network info cache for port 16fe0c70-35aa-4775-8e29-47f94379a9ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:56:54 np0005591762 nova_compute[225313]: 2026-01-22 09:56:54.251 225317 DEBUG nova.network.neutron [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:56:54 np0005591762 nova_compute[225313]: 2026-01-22 09:56:54.266 225317 DEBUG oslo_concurrency.lockutils [req-2a3f7532-4ebf-49e6-89c3-9b03e7cf2994 req-e6f49c77-229f-4a40-af07-ec2deaa250be e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:56:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:55 np0005591762 nova_compute[225313]: 2026-01-22 09:56:55.002 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:55 np0005591762 nova_compute[225313]: 2026-01-22 09:56:55.244 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:56:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:56:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:56.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:56:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:56:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:56:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:57.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:56:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:56:58.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:56:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:56:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:56:59.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:56:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:56:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:56:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:56:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:00 np0005591762 nova_compute[225313]: 2026-01-22 09:57:00.004 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:00.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:00 np0005591762 nova_compute[225313]: 2026-01-22 09:57:00.246 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:01.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.289 225317 DEBUG oslo_concurrency.lockutils [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "interface-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-16fe0c70-35aa-4775-8e29-47f94379a9ae" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.289 225317 DEBUG oslo_concurrency.lockutils [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "interface-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-16fe0c70-35aa-4775-8e29-47f94379a9ae" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.299 225317 DEBUG nova.objects.instance [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'flavor' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.316 225317 DEBUG nova.virt.libvirt.vif [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.316 225317 DEBUG nova.network.os_vif_util [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.317 225317 DEBUG nova.network.os_vif_util [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.319 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.321 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.322 225317 DEBUG nova.virt.libvirt.driver [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Attempting to detach device tap16fe0c70-35 from instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.322 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] detach device xml: <interface type="ethernet">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <mac address="fa:16:3e:f3:23:b9"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <model type="virtio"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <mtu size="1442"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <target dev="tap16fe0c70-35"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </interface>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.325 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.327 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface>not found in domain: <domain type='kvm' id='2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <name>instance-00000006</name>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <uuid>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</uuid>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:56:20</nova:creationTime>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:port uuid="16fe0c70-35aa-4775-8e29-47f94379a9ae">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <memory unit='KiB'>131072</memory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <vcpu placement='static'>1</vcpu>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <resource>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <partition>/machine</partition>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </resource>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <sysinfo type='smbios'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='manufacturer'>RDO</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='serial'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='uuid'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='family'>Virtual Machine</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <boot dev='hd'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <smbios mode='sysinfo'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <vmcoreinfo state='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <vendor>AMD</vendor>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='x2apic'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc-deadline'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='hypervisor'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc_adjust'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='vaes'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='spec-ctrl'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='stibp'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='ssbd'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='cmp_legacy'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='overflow-recov'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='succor'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='virt-ssbd'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='lbrv'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='tsc-scale'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vmcb-clean'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='flushbyasid'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pause-filter'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pfthreshold'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='v-vmsave-vmload'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vgif'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svm'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='topoext'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='npt'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='nrip-save'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <clock offset='utc'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <timer name='hpet' present='no'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <on_poweroff>destroy</on_poweroff>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <on_reboot>restart</on_reboot>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <on_crash>destroy</on_crash>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <disk type='network' device='disk'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk' index='2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='vda' bus='virtio'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='virtio-disk0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <disk type='network' device='cdrom'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config' index='1'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='sda' bus='sata'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <readonly/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='sata0-0-0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pcie.0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='1' port='0x10'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='2' port='0x11'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='3' port='0x12'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='4' port='0x13'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='5' port='0x14'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='6' port='0x15'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='7' port='0x16'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='8' port='0x17'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.8'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='9' port='0x18'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.9'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='10' port='0x19'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.10'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='11' port='0x1a'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.11'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='12' port='0x1b'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.12'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='13' port='0x1c'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.13'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='14' port='0x1d'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.14'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='15' port='0x1e'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.15'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='16' port='0x1f'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.16'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='17' port='0x20'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.17'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='18' port='0x21'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.18'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='19' port='0x22'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.19'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='20' port='0x23'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.20'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='21' port='0x24'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.21'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='22' port='0x25'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.22'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='23' port='0x26'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.23'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='24' port='0x27'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.24'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='25' port='0x28'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.25'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-pci-bridge'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.26'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='usb'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='sata' index='0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='ide'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <interface type='ethernet'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <mac address='fa:16:3e:58:5d:ec'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='tap2c18aeb2-0a'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model type='virtio'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <mtu size='1442'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='net0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <interface type='ethernet'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <mac address='fa:16:3e:f3:23:b9'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='tap16fe0c70-35'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model type='virtio'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <mtu size='1442'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='net1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <serial type='pty'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target type='isa-serial' port='0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <model name='isa-serial'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </target>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target type='serial' port='0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <input type='tablet' bus='usb'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='input0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='usb' bus='0' port='1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <input type='mouse' bus='ps2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='input1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <input type='keyboard' bus='ps2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='input2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <listen type='address' address='::0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <audio id='1' type='none'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='video0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <watchdog model='itco' action='reset'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='watchdog0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </watchdog>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <memballoon model='virtio'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <stats period='10'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='balloon0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <rng model='virtio'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <backend model='random'>/dev/urandom</backend>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='rng0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <label>system_u:system_r:svirt_t:s0:c522,c604</label>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c522,c604</imagelabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <label>+107:+107</label>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <imagelabel>+107:+107</imagelabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.329 225317 INFO nova.virt.libvirt.driver [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully detached device tap16fe0c70-35 from instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec from the persistent domain config.
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.330 225317 DEBUG nova.virt.libvirt.driver [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] (1/8): Attempting to detach device tap16fe0c70-35 with device alias net1 from instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.330 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] detach device xml: <interface type="ethernet">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <mac address="fa:16:3e:f3:23:b9"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <model type="virtio"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <mtu size="1442"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <target dev="tap16fe0c70-35"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </interface>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 04:57:01 np0005591762 kernel: tap16fe0c70-35 (unregistering): left promiscuous mode
Jan 22 04:57:01 np0005591762 NetworkManager[48910]: <info>  [1769075821.4271] device (tap16fe0c70-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.431 225317 DEBUG nova.virt.libvirt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Received event <DeviceRemovedEvent: 1769075821.431555, c2a740a7-21a6-42d9-9b2f-8ba5143e0cec => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 22 04:57:01 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:01Z|00049|binding|INFO|Releasing lport 16fe0c70-35aa-4775-8e29-47f94379a9ae from this chassis (sb_readonly=0)
Jan 22 04:57:01 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:01Z|00050|binding|INFO|Setting lport 16fe0c70-35aa-4775-8e29-47f94379a9ae down in Southbound
Jan 22 04:57:01 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:01Z|00051|binding|INFO|Removing iface tap16fe0c70-35 ovn-installed in OVS
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.434 225317 DEBUG nova.virt.libvirt.driver [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Start waiting for the detach event from libvirt for device tap16fe0c70-35 with device alias net1 for instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.434 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.434 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.435 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.440 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:23:b9 10.100.0.24', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'c2a740a7-21a6-42d9-9b2f-8ba5143e0cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-860d8653-9bd9-4e89-8273-d52438f39b9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3801e067-ec62-4ee2-a3f1-e0f0f4dedd37, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=16fe0c70-35aa-4775-8e29-47f94379a9ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.441 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 16fe0c70-35aa-4775-8e29-47f94379a9ae in datapath 860d8653-9bd9-4e89-8273-d52438f39b9f unbound from our chassis#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.442 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 860d8653-9bd9-4e89-8273-d52438f39b9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.443 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[f71d864c-d10f-4ed1-8995-981044804877]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.443 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f namespace which is not needed anymore#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.443 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface>not found in domain: <domain type='kvm' id='2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <name>instance-00000006</name>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <uuid>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</uuid>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:56:20</nova:creationTime>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:port uuid="16fe0c70-35aa-4775-8e29-47f94379a9ae">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <memory unit='KiB'>131072</memory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <vcpu placement='static'>1</vcpu>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <resource>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <partition>/machine</partition>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </resource>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <sysinfo type='smbios'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='manufacturer'>RDO</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='serial'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='uuid'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <entry name='family'>Virtual Machine</entry>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <boot dev='hd'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <smbios mode='sysinfo'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <vmcoreinfo state='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <vendor>AMD</vendor>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='x2apic'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc-deadline'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='hypervisor'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc_adjust'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='vaes'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='spec-ctrl'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='stibp'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='ssbd'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='cmp_legacy'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='overflow-recov'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='succor'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='virt-ssbd'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='lbrv'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='tsc-scale'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vmcb-clean'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='flushbyasid'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pause-filter'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pfthreshold'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='v-vmsave-vmload'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vgif'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svm'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='require' name='topoext'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='npt'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='nrip-save'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <clock offset='utc'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <timer name='hpet' present='no'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <on_poweroff>destroy</on_poweroff>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <on_reboot>restart</on_reboot>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <on_crash>destroy</on_crash>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <disk type='network' device='disk'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk' index='2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='vda' bus='virtio'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='virtio-disk0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <disk type='network' device='cdrom'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config' index='1'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='sda' bus='sata'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <readonly/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='sata0-0-0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pcie.0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='1' port='0x10'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='2' port='0x11'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='3' port='0x12'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='4' port='0x13'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='5' port='0x14'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='6' port='0x15'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='7' port='0x16'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='8' port='0x17'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.8'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='9' port='0x18'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.9'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='10' port='0x19'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.10'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='11' port='0x1a'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.11'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='12' port='0x1b'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.12'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='13' port='0x1c'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.13'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='14' port='0x1d'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.14'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='15' port='0x1e'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.15'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='16' port='0x1f'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.16'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='17' port='0x20'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.17'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='18' port='0x21'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.18'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='19' port='0x22'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.19'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='20' port='0x23'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.20'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='21' port='0x24'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.21'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='22' port='0x25'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.22'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='23' port='0x26'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.23'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='24' port='0x27'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.24'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target chassis='25' port='0x28'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.25'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model name='pcie-pci-bridge'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='pci.26'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='usb'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <controller type='sata' index='0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='ide'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <interface type='ethernet'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <mac address='fa:16:3e:58:5d:ec'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target dev='tap2c18aeb2-0a'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model type='virtio'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <mtu size='1442'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='net0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <serial type='pty'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target type='isa-serial' port='0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:        <model name='isa-serial'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      </target>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <target type='serial' port='0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <input type='tablet' bus='usb'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='input0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='usb' bus='0' port='1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <input type='mouse' bus='ps2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='input1'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <input type='keyboard' bus='ps2'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='input2'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <listen type='address' address='::0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <audio id='1' type='none'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='video0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <watchdog model='itco' action='reset'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='watchdog0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </watchdog>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <memballoon model='virtio'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <stats period='10'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='balloon0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <rng model='virtio'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <backend model='random'>/dev/urandom</backend>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <alias name='rng0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <label>system_u:system_r:svirt_t:s0:c522,c604</label>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c522,c604</imagelabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <label>+107:+107</label>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <imagelabel>+107:+107</imagelabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.443 225317 INFO nova.virt.libvirt.driver [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully detached device tap16fe0c70-35 from instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec from the live domain config.#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.444 225317 DEBUG nova.virt.libvirt.vif [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.446 225317 DEBUG nova.network.os_vif_util [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.447 225317 DEBUG nova.network.os_vif_util [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.447 225317 DEBUG os_vif [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.448 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.449 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16fe0c70-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.452 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.454 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.458 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.463 225317 INFO os_vif [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35')#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.463 225317 DEBUG nova.virt.libvirt.guest [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:57:01</nova:creationTime>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:57:01 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:01 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:57:01 np0005591762 nova_compute[225313]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 22 04:57:01 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [NOTICE]   (230637) : haproxy version is 2.8.14-c23fe91
Jan 22 04:57:01 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [NOTICE]   (230637) : path to executable is /usr/sbin/haproxy
Jan 22 04:57:01 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [WARNING]  (230637) : Exiting Master process...
Jan 22 04:57:01 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [WARNING]  (230637) : Exiting Master process...
Jan 22 04:57:01 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [ALERT]    (230637) : Current worker (230639) exited with code 143 (Terminated)
Jan 22 04:57:01 np0005591762 neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f[230633]: [WARNING]  (230637) : All workers exited. Exiting... (0)
Jan 22 04:57:01 np0005591762 systemd[1]: libpod-892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146.scope: Deactivated successfully.
Jan 22 04:57:01 np0005591762 podman[230794]: 2026-01-22 09:57:01.543587729 +0000 UTC m=+0.033660894 container died 892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 04:57:01 np0005591762 systemd[1]: var-lib-containers-storage-overlay-24cb637187fffbc12d7c63ec90226e3c0aeaa9477832bce8f881ff2be0f054cb-merged.mount: Deactivated successfully.
Jan 22 04:57:01 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146-userdata-shm.mount: Deactivated successfully.
Jan 22 04:57:01 np0005591762 podman[230794]: 2026-01-22 09:57:01.570819477 +0000 UTC m=+0.060892642 container cleanup 892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:57:01 np0005591762 systemd[1]: libpod-conmon-892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146.scope: Deactivated successfully.
Jan 22 04:57:01 np0005591762 podman[230818]: 2026-01-22 09:57:01.613300796 +0000 UTC m=+0.025772778 container remove 892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.619 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[0a8a8ae7-04cd-4d0e-a0bd-4fb44e1cb29b]: (4, ('Thu Jan 22 09:57:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f (892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146)\n892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146\nThu Jan 22 09:57:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f (892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146)\n892c395ee9b255a3f8b74412150e21cb7d88a53a0ad4a9b39a498546c4ab0146\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.620 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[8e67b1cf-7c9a-4ea9-8bc6-161f18dc4246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.621 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap860d8653-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.623 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 kernel: tap860d8653-90: left promiscuous mode
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.628 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[9d907980-2eb9-4b4c-8d71-deea3340c054]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 nova_compute[225313]: 2026-01-22 09:57:01.640 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.642 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[9d78de79-89be-450c-8fd9-2836ab3b5523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.643 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[dade9dc6-44fa-40fe-8cde-659c9a7938bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.654 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[01de5af8-464f-4251-afd4-6d5019cae3cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 335763, 'reachable_time': 21001, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230829, 'error': None, 'target': 'ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 systemd[1]: run-netns-ovnmeta\x2d860d8653\x2d9bd9\x2d4e89\x2d8273\x2dd52438f39b9f.mount: Deactivated successfully.
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.656 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-860d8653-9bd9-4e89-8273-d52438f39b9f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 04:57:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:01.656 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[8de8c6f0-7ae7-472d-bf65-f2b66c6f7a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:02.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:02 np0005591762 nova_compute[225313]: 2026-01-22 09:57:02.065 225317 DEBUG oslo_concurrency.lockutils [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:57:02 np0005591762 nova_compute[225313]: 2026-01-22 09:57:02.065 225317 DEBUG oslo_concurrency.lockutils [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:57:02 np0005591762 nova_compute[225313]: 2026-01-22 09:57:02.065 225317 DEBUG nova.network.neutron [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 04:57:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:03.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.497 225317 DEBUG nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-unplugged-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.497 225317 DEBUG oslo_concurrency.lockutils [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.498 225317 DEBUG oslo_concurrency.lockutils [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.498 225317 DEBUG oslo_concurrency.lockutils [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.499 225317 DEBUG nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-unplugged-16fe0c70-35aa-4775-8e29-47f94379a9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.499 225317 WARNING nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-unplugged-16fe0c70-35aa-4775-8e29-47f94379a9ae for instance with vm_state active and task_state None.#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.499 225317 DEBUG nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.499 225317 DEBUG oslo_concurrency.lockutils [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.500 225317 DEBUG oslo_concurrency.lockutils [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.500 225317 DEBUG oslo_concurrency.lockutils [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.500 225317 DEBUG nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.500 225317 WARNING nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-plugged-16fe0c70-35aa-4775-8e29-47f94379a9ae for instance with vm_state active and task_state None.#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.501 225317 DEBUG nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-deleted-16fe0c70-35aa-4775-8e29-47f94379a9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.501 225317 INFO nova.compute.manager [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Neutron deleted interface 16fe0c70-35aa-4775-8e29-47f94379a9ae; detaching it from the instance and deleting it from the info cache#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.501 225317 DEBUG nova.network.neutron [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.525 225317 DEBUG nova.objects.instance [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lazy-loading 'system_metadata' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.539 225317 DEBUG nova.objects.instance [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lazy-loading 'flavor' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.552 225317 DEBUG nova.virt.libvirt.vif [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.552 225317 DEBUG nova.network.os_vif_util [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Converting VIF {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.553 225317 DEBUG nova.network.os_vif_util [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.555 225317 DEBUG nova.virt.libvirt.guest [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.558 225317 DEBUG nova.virt.libvirt.guest [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface>not found in domain: <domain type='kvm' id='2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <name>instance-00000006</name>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <uuid>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</uuid>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:57:01</nova:creationTime>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <memory unit='KiB'>131072</memory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <vcpu placement='static'>1</vcpu>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <resource>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <partition>/machine</partition>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </resource>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <sysinfo type='smbios'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='manufacturer'>RDO</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='serial'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='uuid'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='family'>Virtual Machine</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <boot dev='hd'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <smbios mode='sysinfo'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <vmcoreinfo state='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <vendor>AMD</vendor>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='x2apic'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc-deadline'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='hypervisor'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc_adjust'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='vaes'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='spec-ctrl'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='stibp'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='ssbd'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='cmp_legacy'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='overflow-recov'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='succor'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='virt-ssbd'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='lbrv'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='tsc-scale'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vmcb-clean'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='flushbyasid'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pause-filter'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pfthreshold'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='v-vmsave-vmload'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vgif'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svm'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='topoext'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='npt'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='nrip-save'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <clock offset='utc'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <timer name='hpet' present='no'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <on_poweroff>destroy</on_poweroff>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <on_reboot>restart</on_reboot>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <on_crash>destroy</on_crash>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <disk type='network' device='disk'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk' index='2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target dev='vda' bus='virtio'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='virtio-disk0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <disk type='network' device='cdrom'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config' index='1'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target dev='sda' bus='sata'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <readonly/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='sata0-0-0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pcie.0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='1' port='0x10'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='2' port='0x11'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='3' port='0x12'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='4' port='0x13'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='5' port='0x14'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='6' port='0x15'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='7' port='0x16'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='8' port='0x17'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.8'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='9' port='0x18'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.9'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='10' port='0x19'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.10'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='11' port='0x1a'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.11'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='12' port='0x1b'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.12'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='13' port='0x1c'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.13'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='14' port='0x1d'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.14'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='15' port='0x1e'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.15'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='16' port='0x1f'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.16'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='17' port='0x20'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.17'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='18' port='0x21'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.18'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='19' port='0x22'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.19'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='20' port='0x23'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.20'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='21' port='0x24'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.21'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='22' port='0x25'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.22'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='23' port='0x26'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.23'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='24' port='0x27'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.24'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='25' port='0x28'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.25'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-pci-bridge'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.26'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='usb'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='sata' index='0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='ide'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <interface type='ethernet'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <mac address='fa:16:3e:58:5d:ec'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target dev='tap2c18aeb2-0a'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model type='virtio'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <mtu size='1442'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='net0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <serial type='pty'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target type='isa-serial' port='0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <model name='isa-serial'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </target>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target type='serial' port='0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <input type='tablet' bus='usb'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='input0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='usb' bus='0' port='1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <input type='mouse' bus='ps2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='input1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <input type='keyboard' bus='ps2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='input2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <listen type='address' address='::0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <audio id='1' type='none'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='video0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <watchdog model='itco' action='reset'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='watchdog0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </watchdog>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <memballoon model='virtio'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <stats period='10'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='balloon0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <rng model='virtio'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <backend model='random'>/dev/urandom</backend>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='rng0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <label>system_u:system_r:svirt_t:s0:c522,c604</label>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c522,c604</imagelabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <label>+107:+107</label>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <imagelabel>+107:+107</imagelabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.558 225317 DEBUG nova.virt.libvirt.guest [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.561 225317 DEBUG nova.virt.libvirt.guest [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:23:b9"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16fe0c70-35"/></interface>not found in domain: <domain type='kvm' id='2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <name>instance-00000006</name>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <uuid>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</uuid>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:57:01</nova:creationTime>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <memory unit='KiB'>131072</memory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <vcpu placement='static'>1</vcpu>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <resource>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <partition>/machine</partition>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </resource>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <sysinfo type='smbios'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='manufacturer'>RDO</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='serial'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='uuid'>c2a740a7-21a6-42d9-9b2f-8ba5143e0cec</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <entry name='family'>Virtual Machine</entry>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <boot dev='hd'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <smbios mode='sysinfo'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <vmcoreinfo state='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <model fallback='forbid'>EPYC-Milan</model>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <vendor>AMD</vendor>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='x2apic'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc-deadline'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='hypervisor'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='tsc_adjust'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='vaes'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='vpclmulqdq'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='spec-ctrl'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='stibp'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='ssbd'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='cmp_legacy'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='overflow-recov'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='succor'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='virt-ssbd'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='lbrv'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='tsc-scale'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vmcb-clean'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='flushbyasid'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pause-filter'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='pfthreshold'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='v-vmsave-vmload'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='vgif'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svm'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='require' name='topoext'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='npt'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='nrip-save'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <clock offset='utc'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <timer name='hpet' present='no'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <on_poweroff>destroy</on_poweroff>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <on_reboot>restart</on_reboot>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <on_crash>destroy</on_crash>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <disk type='network' device='disk'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk' index='2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target dev='vda' bus='virtio'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='virtio-disk0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <disk type='network' device='cdrom'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <auth username='openstack'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <secret type='ceph' uuid='43df7a30-cf5f-5209-adfd-bf44298b19f2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source protocol='rbd' name='vms/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_disk.config' index='1'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.100' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.102' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <host name='192.168.122.101' port='6789'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target dev='sda' bus='sata'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <readonly/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='sata0-0-0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pcie.0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='1' port='0x10'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='2' port='0x11'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='3' port='0x12'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='4' port='0x13'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='5' port='0x14'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='6' port='0x15'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='7' port='0x16'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='8' port='0x17'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.8'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='9' port='0x18'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.9'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='10' port='0x19'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.10'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='11' port='0x1a'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.11'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='12' port='0x1b'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.12'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='13' port='0x1c'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.13'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='14' port='0x1d'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.14'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='15' port='0x1e'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.15'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='16' port='0x1f'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.16'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='17' port='0x20'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.17'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='18' port='0x21'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.18'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='19' port='0x22'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.19'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='20' port='0x23'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.20'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='21' port='0x24'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.21'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='22' port='0x25'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.22'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='23' port='0x26'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.23'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='24' port='0x27'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.24'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-root-port'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target chassis='25' port='0x28'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.25'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model name='pcie-pci-bridge'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='pci.26'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='usb'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <controller type='sata' index='0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='ide'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </controller>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <interface type='ethernet'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <mac address='fa:16:3e:58:5d:ec'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target dev='tap2c18aeb2-0a'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model type='virtio'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <mtu size='1442'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='net0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <serial type='pty'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target type='isa-serial' port='0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:        <model name='isa-serial'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      </target>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <source path='/dev/pts/0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <log file='/var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec/console.log' append='off'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <target type='serial' port='0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='serial0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </console>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <input type='tablet' bus='usb'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='input0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='usb' bus='0' port='1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <input type='mouse' bus='ps2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='input1'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <input type='keyboard' bus='ps2'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='input2'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </input>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <listen type='address' address='::0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </graphics>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <audio id='1' type='none'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='video0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <watchdog model='itco' action='reset'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='watchdog0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </watchdog>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <memballoon model='virtio'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <stats period='10'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='balloon0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <rng model='virtio'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <backend model='random'>/dev/urandom</backend>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <alias name='rng0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <label>system_u:system_r:svirt_t:s0:c522,c604</label>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c522,c604</imagelabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <label>+107:+107</label>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <imagelabel>+107:+107</imagelabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </seclabel>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.562 225317 WARNING nova.virt.libvirt.driver [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Detaching interface fa:16:3e:f3:23:b9 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap16fe0c70-35' not found.
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.562 225317 DEBUG nova.virt.libvirt.vif [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.562 225317 DEBUG nova.network.os_vif_util [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Converting VIF {"id": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "address": "fa:16:3e:f3:23:b9", "network": {"id": "860d8653-9bd9-4e89-8273-d52438f39b9f", "bridge": "br-int", "label": "tempest-network-smoke--1379404610", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16fe0c70-35", "ovs_interfaceid": "16fe0c70-35aa-4775-8e29-47f94379a9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.563 225317 DEBUG nova.network.os_vif_util [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.563 225317 DEBUG os_vif [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.564 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.565 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16fe0c70-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.565 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.566 225317 INFO os_vif [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:23:b9,bridge_name='br-int',has_traffic_filtering=True,id=16fe0c70-35aa-4775-8e29-47f94379a9ae,network=Network(860d8653-9bd9-4e89-8273-d52438f39b9f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16fe0c70-35')
Jan 22 04:57:03 np0005591762 nova_compute[225313]: 2026-01-22 09:57:03.567 225317 DEBUG nova.virt.libvirt.guest [req-edb22eab-d235-46da-827a-79449c50518d req-c0832c77-e0d6-44f5-988e-70c9610adfba e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:name>tempest-TestNetworkBasicOps-server-725337815</nova:name>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:creationTime>2026-01-22 09:57:03</nova:creationTime>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:flavor name="m1.nano">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:memory>128</nova:memory>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:disk>1</nova:disk>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:swap>0</nova:swap>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:flavor>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:owner>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:owner>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  <nova:ports>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    <nova:port uuid="2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f">
Jan 22 04:57:03 np0005591762 nova_compute[225313]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:    </nova:port>
Jan 22 04:57:03 np0005591762 nova_compute[225313]:  </nova:ports>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: </nova:instance>
Jan 22 04:57:03 np0005591762 nova_compute[225313]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 04:57:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:04.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.062 225317 INFO nova.network.neutron [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Port 16fe0c70-35aa-4775-8e29-47f94379a9ae from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.063 225317 DEBUG nova.network.neutron [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.076 225317 DEBUG oslo_concurrency.lockutils [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.090 225317 DEBUG oslo_concurrency.lockutils [None req-e252f241-a809-4256-b638-0a154078ba92 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "interface-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-16fe0c70-35aa-4775-8e29-47f94379a9ae" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 04:57:04 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:04Z|00052|binding|INFO|Releasing lport 1240fd1c-f79e-4e91-8b0f-3356e4c99edb from this chassis (sb_readonly=0)
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.285 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.889 225317 DEBUG nova.compute.manager [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-changed-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.890 225317 DEBUG nova.compute.manager [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing instance network info cache due to event network-changed-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.890 225317 DEBUG oslo_concurrency.lockutils [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.890 225317 DEBUG oslo_concurrency.lockutils [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.890 225317 DEBUG nova.network.neutron [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Refreshing network info cache for port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.935 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.935 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.935 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.936 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.936 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.937 225317 INFO nova.compute.manager [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Terminating instance#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.937 225317 DEBUG nova.compute.manager [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 04:57:04 np0005591762 kernel: tap2c18aeb2-0a (unregistering): left promiscuous mode
Jan 22 04:57:04 np0005591762 NetworkManager[48910]: <info>  [1769075824.9685] device (tap2c18aeb2-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 04:57:04 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:04Z|00053|binding|INFO|Releasing lport 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f from this chassis (sb_readonly=0)
Jan 22 04:57:04 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:04Z|00054|binding|INFO|Setting lport 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f down in Southbound
Jan 22 04:57:04 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:04Z|00055|binding|INFO|Removing iface tap2c18aeb2-0a ovn-installed in OVS
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.974 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.976 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:04 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:04.977 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:5d:ec 10.100.0.5'], port_security=['fa:16:3e:58:5d:ec 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c2a740a7-21a6-42d9-9b2f-8ba5143e0cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ac6bd202-9ce4-49d7-a7dd-c8aca89509c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3e6744-68b4-4126-a841-767d678dbcb8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:57:04 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:04.978 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f in datapath f3e7c2ec-12ff-4a29-aade-135175be50e3 unbound from our chassis#033[00m
Jan 22 04:57:04 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:04.979 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3e7c2ec-12ff-4a29-aade-135175be50e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 04:57:04 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:04.980 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[950733c2-1e50-49ea-aa02-44407d37d769]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:04 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:04.980 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3 namespace which is not needed anymore#033[00m
Jan 22 04:57:04 np0005591762 nova_compute[225313]: 2026-01-22 09:57:04.992 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 22 04:57:05 np0005591762 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 13.227s CPU time.
Jan 22 04:57:05 np0005591762 systemd-machined[193990]: Machine qemu-2-instance-00000006 terminated.
Jan 22 04:57:05 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [NOTICE]   (230270) : haproxy version is 2.8.14-c23fe91
Jan 22 04:57:05 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [NOTICE]   (230270) : path to executable is /usr/sbin/haproxy
Jan 22 04:57:05 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [WARNING]  (230270) : Exiting Master process...
Jan 22 04:57:05 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [ALERT]    (230270) : Current worker (230272) exited with code 143 (Terminated)
Jan 22 04:57:05 np0005591762 neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3[230266]: [WARNING]  (230270) : All workers exited. Exiting... (0)
Jan 22 04:57:05 np0005591762 systemd[1]: libpod-08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1.scope: Deactivated successfully.
Jan 22 04:57:05 np0005591762 podman[230856]: 2026-01-22 09:57:05.075375128 +0000 UTC m=+0.031791570 container died 08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 04:57:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:57:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:05.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:57:05 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1-userdata-shm.mount: Deactivated successfully.
Jan 22 04:57:05 np0005591762 systemd[1]: var-lib-containers-storage-overlay-c4db7cbdd87946036038f8b20dd520bda04ea47623dd8d0f428e73fb9a69fad5-merged.mount: Deactivated successfully.
Jan 22 04:57:05 np0005591762 podman[230856]: 2026-01-22 09:57:05.097089692 +0000 UTC m=+0.053506133 container cleanup 08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:57:05 np0005591762 systemd[1]: libpod-conmon-08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1.scope: Deactivated successfully.
Jan 22 04:57:05 np0005591762 podman[230882]: 2026-01-22 09:57:05.137950965 +0000 UTC m=+0.024753906 container remove 08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.142 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[a30d8391-dd6e-4908-aac6-36f5a7c8c399]: (4, ('Thu Jan 22 09:57:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3 (08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1)\n08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1\nThu Jan 22 09:57:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3 (08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1)\n08f67ea2c4e9653fb5da9bd345e28fc082c191223e27d8300e8437bf3d7576e1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.143 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[8e186e17-08a1-4aeb-b9c7-fc2ad9c5aefe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.144 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3e7c2ec-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.146 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 kernel: tapf3e7c2ec-10: left promiscuous mode
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.159 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.163 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.166 225317 INFO nova.virt.libvirt.driver [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Instance destroyed successfully.#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.166 225317 DEBUG nova.objects.instance [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'resources' on Instance uuid c2a740a7-21a6-42d9-9b2f-8ba5143e0cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.167 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[abea1b69-ab65-4d6a-8567-1f1673b6728c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.175 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e2c3e9-98e2-42ec-ba81-0dc66593a0dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.175 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[511ecc59-2761-42e7-a8fa-6c4fe998f41e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.179 225317 DEBUG nova.virt.libvirt.vif [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:55:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-725337815',display_name='tempest-TestNetworkBasicOps-server-725337815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-725337815',id=6,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEt9a9Cwrt1ky7Btn658kNlK7EWqurfln8dBy7UglIgF5StE3GFzjKPOZUBAJLUxCXTv7fRYRKKUDmu1I5Tz3oB+gjse5xMtizG6A6rAuXb+mwdfAvhLQYNgMvIeDx+IJg==',key_name='tempest-TestNetworkBasicOps-901785641',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:55:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-n0b8aens',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:55:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=c2a740a7-21a6-42d9-9b2f-8ba5143e0cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.179 225317 DEBUG nova.network.os_vif_util [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.179 225317 DEBUG nova.network.os_vif_util [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.180 225317 DEBUG os_vif [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.181 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.181 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c18aeb2-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.182 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.183 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.185 225317 INFO os_vif [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:5d:ec,bridge_name='br-int',has_traffic_filtering=True,id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f,network=Network(f3e7c2ec-12ff-4a29-aade-135175be50e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c18aeb2-0a')#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.188 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[819cfb8c-b11b-4ce7-b267-090ecd0e7b61]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333192, 'reachable_time': 20321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230904, 'error': None, 'target': 'ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.189 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f3e7c2ec-12ff-4a29-aade-135175be50e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 04:57:05 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:05.190 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[b15127d9-9961-41e2-888a-229926b748bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:05 np0005591762 systemd[1]: run-netns-ovnmeta\x2df3e7c2ec\x2d12ff\x2d4a29\x2daade\x2d135175be50e3.mount: Deactivated successfully.
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.249 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.345 225317 INFO nova.virt.libvirt.driver [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Deleting instance files /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_del#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.345 225317 INFO nova.virt.libvirt.driver [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Deletion of /var/lib/nova/instances/c2a740a7-21a6-42d9-9b2f-8ba5143e0cec_del complete#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.383 225317 INFO nova.compute.manager [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.383 225317 DEBUG oslo.service.loopingcall [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.384 225317 DEBUG nova.compute.manager [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 04:57:05 np0005591762 nova_compute[225313]: 2026-01-22 09:57:05.384 225317 DEBUG nova.network.neutron [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 04:57:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.216 225317 DEBUG nova.network.neutron [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.226 225317 INFO nova.compute.manager [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Took 0.84 seconds to deallocate network for instance.#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.256 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.257 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.291 225317 DEBUG oslo_concurrency.processutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.360 225317 DEBUG nova.network.neutron [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updated VIF entry in instance network info cache for port 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.361 225317 DEBUG nova.network.neutron [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Updating instance_info_cache with network_info: [{"id": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "address": "fa:16:3e:58:5d:ec", "network": {"id": "f3e7c2ec-12ff-4a29-aade-135175be50e3", "bridge": "br-int", "label": "tempest-network-smoke--475301959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c18aeb2-0a", "ovs_interfaceid": "2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.375 225317 DEBUG oslo_concurrency.lockutils [req-b0c11253-40d3-4fc1-b659-2adc3d11f3e7 req-830dcef4-2b9c-4596-ac60-b6a8989a2a80 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:57:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:57:06 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3407251127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.629 225317 DEBUG oslo_concurrency.processutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.633 225317 DEBUG nova.compute.provider_tree [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.645 225317 DEBUG nova.scheduler.client.report [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:57:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.669 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.691 225317 INFO nova.scheduler.client.report [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Deleted allocations for instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.730 225317 DEBUG oslo_concurrency.lockutils [None req-f3cf0057-fd0b-4d5d-8eaf-7c6c29bef417 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:06 np0005591762 podman[230949]: 2026-01-22 09:57:06.818928213 +0000 UTC m=+0.040347695 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 04:57:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.959 225317 DEBUG nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-unplugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.959 225317 DEBUG oslo_concurrency.lockutils [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.959 225317 DEBUG oslo_concurrency.lockutils [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.959 225317 DEBUG oslo_concurrency.lockutils [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 DEBUG nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-unplugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 WARNING nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-unplugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f for instance with vm_state deleted and task_state None.#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 DEBUG nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 DEBUG oslo_concurrency.lockutils [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 DEBUG oslo_concurrency.lockutils [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 DEBUG oslo_concurrency.lockutils [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "c2a740a7-21a6-42d9-9b2f-8ba5143e0cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.960 225317 DEBUG nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] No waiting events found dispatching network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.961 225317 WARNING nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received unexpected event network-vif-plugged-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f for instance with vm_state deleted and task_state None.#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.961 225317 DEBUG nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Received event network-vif-deleted-2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.961 225317 INFO nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Neutron deleted interface 2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f; detaching it from the instance and deleting it from the info cache#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.961 225317 DEBUG nova.network.neutron [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 22 04:57:06 np0005591762 nova_compute[225313]: 2026-01-22 09:57:06.963 225317 DEBUG nova.compute.manager [req-c04efddf-758b-4bd7-aaf4-21f394fc31be req-51323d8a-946f-4c6e-a099-8cc54800f780 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Detach interface failed, port_id=2c18aeb2-0a9a-4b61-82b7-99c5c1b1589f, reason: Instance c2a740a7-21a6-42d9-9b2f-8ba5143e0cec could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 22 04:57:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:08.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:09.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:10.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:10 np0005591762 nova_compute[225313]: 2026-01-22 09:57:10.071 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:10 np0005591762 nova_compute[225313]: 2026-01-22 09:57:10.153 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:10 np0005591762 nova_compute[225313]: 2026-01-22 09:57:10.182 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:10 np0005591762 nova_compute[225313]: 2026-01-22 09:57:10.252 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:11.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:12.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.811 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.811 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.812 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.812 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.824 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.824 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.824 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.824 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:12 np0005591762 nova_compute[225313]: 2026-01-22 09:57:12.824 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:57:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:13.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:14.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:14 np0005591762 nova_compute[225313]: 2026-01-22 09:57:14.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:15.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:15 np0005591762 podman[231023]: 2026-01-22 09:57:15.131662153 +0000 UTC m=+0.056143507 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 04:57:15 np0005591762 nova_compute[225313]: 2026-01-22 09:57:15.183 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:15 np0005591762 nova_compute[225313]: 2026-01-22 09:57:15.252 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:15 np0005591762 nova_compute[225313]: 2026-01-22 09:57:15.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:57:15 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:57:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:16.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:16 np0005591762 nova_compute[225313]: 2026-01-22 09:57:16.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:17 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:17.068 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:57:17 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:17.069 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:57:17 np0005591762 nova_compute[225313]: 2026-01-22 09:57:17.069 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:17.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:18.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:18 np0005591762 nova_compute[225313]: 2026-01-22 09:57:18.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:57:18 np0005591762 nova_compute[225313]: 2026-01-22 09:57:18.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:18 np0005591762 nova_compute[225313]: 2026-01-22 09:57:18.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:18 np0005591762 nova_compute[225313]: 2026-01-22 09:57:18.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:18 np0005591762 nova_compute[225313]: 2026-01-22 09:57:18.738 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:57:18 np0005591762 nova_compute[225313]: 2026-01-22 09:57:18.739 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.081 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.296 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.296 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4937MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.297 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.297 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.351 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.351 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.372 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:19 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:57:19 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2440340458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.712 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.340s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.715 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.749 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.765 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:57:19 np0005591762 nova_compute[225313]: 2026-01-22 09:57:19.765 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:19 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:57:19 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:57:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:20.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:20 np0005591762 nova_compute[225313]: 2026-01-22 09:57:20.163 225317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769075825.162398, c2a740a7-21a6-42d9-9b2f-8ba5143e0cec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:57:20 np0005591762 nova_compute[225313]: 2026-01-22 09:57:20.163 225317 INFO nova.compute.manager [-] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] VM Stopped (Lifecycle Event)#033[00m
Jan 22 04:57:20 np0005591762 nova_compute[225313]: 2026-01-22 09:57:20.180 225317 DEBUG nova.compute.manager [None req-b2cfaf00-2458-412a-ba11-2f359e3474b6 - - - - - -] [instance: c2a740a7-21a6-42d9-9b2f-8ba5143e0cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:57:20 np0005591762 nova_compute[225313]: 2026-01-22 09:57:20.184 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:20 np0005591762 nova_compute[225313]: 2026-01-22 09:57:20.254 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:22.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.188 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.189 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.200 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.243 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.243 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.248 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.248 225317 INFO nova.compute.claims [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.324 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:57:22 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2867766528' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.661 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.665 225317 DEBUG nova.compute.provider_tree [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:57:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.677 225317 DEBUG nova.scheduler.client.report [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.690 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.690 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.725 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.726 225317 DEBUG nova.network.neutron [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.740 225317 INFO nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.751 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.812 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.813 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.813 225317 INFO nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Creating image(s)#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.831 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.852 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.870 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.872 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.918 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.918 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "9db187949728ea707722fd244d769f131efa8688" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.919 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.919 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.938 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:57:22 np0005591762 nova_compute[225313]: 2026-01-22 09:57:22.939 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.066 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:23.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.110 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] resizing rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.157 225317 DEBUG nova.objects.instance [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'migration_context' on Instance uuid 8e9afaf2-adf6-46e6-8c20-227ea75186b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.168 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.169 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Ensure instance console log exists: /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.169 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.169 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.170 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:23 np0005591762 nova_compute[225313]: 2026-01-22 09:57:23.176 225317 DEBUG nova.policy [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4428dd9b0fb64c25b8f33b0050d4ef6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 04:57:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:24.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.290 225317 DEBUG nova.network.neutron [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Successfully updated port: e82a0b0a-fa8a-4ea6-98e1-12794778865d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.301 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.301 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.301 225317 DEBUG nova.network.neutron [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.371 225317 DEBUG nova.compute.manager [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-changed-e82a0b0a-fa8a-4ea6-98e1-12794778865d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.372 225317 DEBUG nova.compute.manager [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Refreshing instance network info cache due to event network-changed-e82a0b0a-fa8a-4ea6-98e1-12794778865d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.372 225317 DEBUG oslo_concurrency.lockutils [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:57:24 np0005591762 nova_compute[225313]: 2026-01-22 09:57:24.415 225317 DEBUG nova.network.neutron [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 04:57:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:57:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:25.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.135 225317 DEBUG nova.network.neutron [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Updating instance_info_cache with network_info: [{"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.149 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.149 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Instance network_info: |[{"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.149 225317 DEBUG oslo_concurrency.lockutils [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.150 225317 DEBUG nova.network.neutron [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Refreshing network info cache for port e82a0b0a-fa8a-4ea6-98e1-12794778865d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.152 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Start _get_guest_xml network_info=[{"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'encryption_options': None, 'image_id': 'bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.155 225317 WARNING nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.157 225317 DEBUG nova.virt.libvirt.host [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.158 225317 DEBUG nova.virt.libvirt.host [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.160 225317 DEBUG nova.virt.libvirt.host [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.160 225317 DEBUG nova.virt.libvirt.host [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.161 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.161 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T09:51:32Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6eff66ba-fb3e-4ca7-b05b-920b01d9affd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.161 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.162 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.162 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.162 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.162 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.163 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.163 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.163 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.163 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.164 225317 DEBUG nova.virt.hardware [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.166 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.185 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.256 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:25 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 04:57:25 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1716965634' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.516 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.350s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.534 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.536 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:25 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 04:57:25 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/19879308' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.876 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.340s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.877 225317 DEBUG nova.virt.libvirt.vif [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:57:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-848748631',display_name='tempest-TestNetworkBasicOps-server-848748631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-848748631',id=8,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKH+U724EaOKBknPaXf2RKgtd8DVtla/9cHasfvf1usw/C9M7fa/JShiAffn9j+2oIvZpWu8+s44VRhSwREi0UFAiqLCqWelOq+C+U2sTsmKXL58Y70x90AVi/BRbYQiBQ==',key_name='tempest-TestNetworkBasicOps-26159022',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-e4sn1fm8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:57:22Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=8e9afaf2-adf6-46e6-8c20-227ea75186b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.878 225317 DEBUG nova.network.os_vif_util [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.878 225317 DEBUG nova.network.os_vif_util [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.879 225317 DEBUG nova.objects.instance [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e9afaf2-adf6-46e6-8c20-227ea75186b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.890 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <uuid>8e9afaf2-adf6-46e6-8c20-227ea75186b1</uuid>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <name>instance-00000008</name>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <memory>131072</memory>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <vcpu>1</vcpu>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:name>tempest-TestNetworkBasicOps-server-848748631</nova:name>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:creationTime>2026-01-22 09:57:25</nova:creationTime>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:flavor name="m1.nano">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:memory>128</nova:memory>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:disk>1</nova:disk>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:swap>0</nova:swap>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:vcpus>1</nova:vcpus>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </nova:flavor>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:owner>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </nova:owner>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <nova:ports>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <nova:port uuid="e82a0b0a-fa8a-4ea6-98e1-12794778865d">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        </nova:port>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </nova:ports>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </nova:instance>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <sysinfo type="smbios">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <entry name="manufacturer">RDO</entry>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <entry name="product">OpenStack Compute</entry>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <entry name="serial">8e9afaf2-adf6-46e6-8c20-227ea75186b1</entry>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <entry name="uuid">8e9afaf2-adf6-46e6-8c20-227ea75186b1</entry>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <entry name="family">Virtual Machine</entry>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <boot dev="hd"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <smbios mode="sysinfo"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <vmcoreinfo/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <clock offset="utc">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <timer name="hpet" present="no"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <cpu mode="host-model" match="exact">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <disk type="network" device="disk">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <target dev="vda" bus="virtio"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <disk type="network" device="cdrom">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk.config">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <target dev="sda" bus="sata"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <interface type="ethernet">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <mac address="fa:16:3e:b7:aa:80"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <mtu size="1442"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <target dev="tape82a0b0a-fa"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <serial type="pty">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <log file="/var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/console.log" append="off"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <input type="tablet" bus="usb"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <rng model="virtio">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <backend model="random">/dev/urandom</backend>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <controller type="usb" index="0"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    <memballoon model="virtio">
Jan 22 04:57:25 np0005591762 nova_compute[225313]:      <stats period="10"/>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:57:25 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:57:25 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:57:25 np0005591762 nova_compute[225313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.891 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Preparing to wait for external event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.891 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.891 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.892 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.892 225317 DEBUG nova.virt.libvirt.vif [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:57:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-848748631',display_name='tempest-TestNetworkBasicOps-server-848748631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-848748631',id=8,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKH+U724EaOKBknPaXf2RKgtd8DVtla/9cHasfvf1usw/C9M7fa/JShiAffn9j+2oIvZpWu8+s44VRhSwREi0UFAiqLCqWelOq+C+U2sTsmKXL58Y70x90AVi/BRbYQiBQ==',key_name='tempest-TestNetworkBasicOps-26159022',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-e4sn1fm8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:57:22Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=8e9afaf2-adf6-46e6-8c20-227ea75186b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.892 225317 DEBUG nova.network.os_vif_util [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.893 225317 DEBUG nova.network.os_vif_util [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.893 225317 DEBUG os_vif [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.893 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.894 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.894 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.896 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.897 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape82a0b0a-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.897 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape82a0b0a-fa, col_values=(('external_ids', {'iface-id': 'e82a0b0a-fa8a-4ea6-98e1-12794778865d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:aa:80', 'vm-uuid': '8e9afaf2-adf6-46e6-8c20-227ea75186b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.898 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:25 np0005591762 NetworkManager[48910]: <info>  [1769075845.8989] manager: (tape82a0b0a-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.900 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.902 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.903 225317 INFO os_vif [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa')
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.931 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.932 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.932 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:b7:aa:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.932 225317 INFO nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Using config drive
Jan 22 04:57:25 np0005591762 nova_compute[225313]: 2026-01-22 09:57:25.949 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 22 04:57:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:26.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.210 225317 INFO nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Creating config drive at /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/disk.config
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.214 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv7ymkhmo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.229 225317 DEBUG nova.network.neutron [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Updated VIF entry in instance network info cache for port e82a0b0a-fa8a-4ea6-98e1-12794778865d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.230 225317 DEBUG nova.network.neutron [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Updating instance_info_cache with network_info: [{"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.241 225317 DEBUG oslo_concurrency.lockutils [req-513cea75-bb09-4587-9c47-089bfb6866df req-077bd28a-99e5-47e0-a548-daea6efda30c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.331 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv7ymkhmo" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.350 225317 DEBUG nova.storage.rbd_utils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.353 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/disk.config 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.433 225317 DEBUG oslo_concurrency.processutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/disk.config 8e9afaf2-adf6-46e6-8c20-227ea75186b1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.433 225317 INFO nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Deleting local config drive /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1/disk.config because it was imported into RBD.
Jan 22 04:57:26 np0005591762 kernel: tape82a0b0a-fa: entered promiscuous mode
Jan 22 04:57:26 np0005591762 NetworkManager[48910]: <info>  [1769075846.4644] manager: (tape82a0b0a-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 22 04:57:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:26Z|00056|binding|INFO|Claiming lport e82a0b0a-fa8a-4ea6-98e1-12794778865d for this chassis.
Jan 22 04:57:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:26Z|00057|binding|INFO|e82a0b0a-fa8a-4ea6-98e1-12794778865d: Claiming fa:16:3e:b7:aa:80 10.100.0.11
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.470 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.472 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.475 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:aa:80 10.100.0.11'], port_security=['fa:16:3e:b7:aa:80 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1829262121', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8e9afaf2-adf6-46e6-8c20-227ea75186b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1829262121', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '36f82e25-219e-420f-acf7-94f16329ca95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b103371-4cd7-44d9-8766-2ade2d0d375d, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=e82a0b0a-fa8a-4ea6-98e1-12794778865d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.476 143150 INFO neutron.agent.ovn.metadata.agent [-] Port e82a0b0a-fa8a-4ea6-98e1-12794778865d in datapath 875c8e70-f887-4bf0-ad2e-29e53ca07fc5 bound to our chassis
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.477 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 875c8e70-f887-4bf0-ad2e-29e53ca07fc5
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.487 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[4efd2c41-ce5d-40fe-b615-877969bda28a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.488 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap875c8e70-f1 in ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.489 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap875c8e70-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.489 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[7fa4aec1-2194-4fe3-a69b-c7151cf05ade]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.490 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[66c9928b-39b2-4d8f-9eb1-5c834a2e7659]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 systemd-udevd[231505]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:57:26 np0005591762 NetworkManager[48910]: <info>  [1769075846.5017] device (tape82a0b0a-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:57:26 np0005591762 NetworkManager[48910]: <info>  [1769075846.5021] device (tape82a0b0a-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.503 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[251fc6ed-20b3-4615-b159-22479f087671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 systemd-machined[193990]: New machine qemu-3-instance-00000008.
Jan 22 04:57:26 np0005591762 systemd[1]: Started Virtual Machine qemu-3-instance-00000008.
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.523 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[876c13b7-c4d9-40a7-abed-197770fb2683]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.543 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[57e9237a-67dd-479c-82ad-41d496885683]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 NetworkManager[48910]: <info>  [1769075846.5472] manager: (tap875c8e70-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.547 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[a44f07de-50a9-453b-b14b-a05d4bd6e715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:26Z|00058|binding|INFO|Setting lport e82a0b0a-fa8a-4ea6-98e1-12794778865d ovn-installed in OVS
Jan 22 04:57:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:26Z|00059|binding|INFO|Setting lport e82a0b0a-fa8a-4ea6-98e1-12794778865d up in Southbound
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.549 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.554 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.577 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[34460501-ce46-43f9-bb06-665aac03987e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.578 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7bbf72-c8ec-4d3d-929b-f747897c3b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 NetworkManager[48910]: <info>  [1769075846.5941] device (tap875c8e70-f0): carrier: link connected
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.597 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3ff8be-439a-4911-9afd-8d2419983ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.608 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[90914cd5-c9b9-4371-9819-56a4ab9aaa41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap875c8e70-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:85:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342415, 'reachable_time': 23229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231530, 'error': None, 'target': 'ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.619 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[96ae26fc-3211-480d-844c-31f7d3d4f72b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:85bf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 342415, 'tstamp': 342415}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231531, 'error': None, 'target': 'ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.631 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e185f9fa-3d1e-4a01-8c30-432800804148]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap875c8e70-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:85:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342415, 'reachable_time': 23229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231532, 'error': None, 'target': 'ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.650 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[d36f92d8-128a-49db-8e06-f1df2a68fd4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 04:57:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.683 225317 DEBUG nova.compute.manager [req-870ad46e-7cc1-4acc-84b5-2c427cf8e3e2 req-2cbbe456-bf64-493a-a8fa-f0bb11dd4406 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.683 225317 DEBUG oslo_concurrency.lockutils [req-870ad46e-7cc1-4acc-84b5-2c427cf8e3e2 req-2cbbe456-bf64-493a-a8fa-f0bb11dd4406 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.684 225317 DEBUG oslo_concurrency.lockutils [req-870ad46e-7cc1-4acc-84b5-2c427cf8e3e2 req-2cbbe456-bf64-493a-a8fa-f0bb11dd4406 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.684 225317 DEBUG oslo_concurrency.lockutils [req-870ad46e-7cc1-4acc-84b5-2c427cf8e3e2 req-2cbbe456-bf64-493a-a8fa-f0bb11dd4406 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.684 225317 DEBUG nova.compute.manager [req-870ad46e-7cc1-4acc-84b5-2c427cf8e3e2 req-2cbbe456-bf64-493a-a8fa-f0bb11dd4406 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Processing event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.692 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[b1461a5d-9577-4768-a51b-04a6560ee6d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.693 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap875c8e70-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.694 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.694 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap875c8e70-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:26 np0005591762 kernel: tap875c8e70-f0: entered promiscuous mode
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.695 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:26 np0005591762 NetworkManager[48910]: <info>  [1769075846.6976] manager: (tap875c8e70-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.697 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.699 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap875c8e70-f0, col_values=(('external_ids', {'iface-id': '6a8edc0a-12a7-4201-8060-8e8897c3f37a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.699 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:26Z|00060|binding|INFO|Releasing lport 6a8edc0a-12a7-4201-8060-8e8897c3f37a from this chassis (sb_readonly=0)
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.700 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.702 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/875c8e70-f887-4bf0-ad2e-29e53ca07fc5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/875c8e70-f887-4bf0-ad2e-29e53ca07fc5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.703 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[cf468865-b814-4270-a246-a615ec23e0ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.704 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-875c8e70-f887-4bf0-ad2e-29e53ca07fc5
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/875c8e70-f887-4bf0-ad2e-29e53ca07fc5.pid.haproxy
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID 875c8e70-f887-4bf0-ad2e-29e53ca07fc5
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 04:57:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:26.705 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'env', 'PROCESS_TAG=haproxy-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/875c8e70-f887-4bf0-ad2e-29e53ca07fc5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.713 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.827 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.827 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075846.8269534, 8e9afaf2-adf6-46e6-8c20-227ea75186b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.828 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] VM Started (Lifecycle Event)#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.831 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.833 225317 INFO nova.virt.libvirt.driver [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Instance spawned successfully.#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.833 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.844 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.847 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.850 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.851 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.851 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.851 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.852 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.852 225317 DEBUG nova.virt.libvirt.driver [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.871 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.871 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075846.827523, 8e9afaf2-adf6-46e6-8c20-227ea75186b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.872 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] VM Paused (Lifecycle Event)#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.892 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.894 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075846.830997, 8e9afaf2-adf6-46e6-8c20-227ea75186b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.894 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] VM Resumed (Lifecycle Event)#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.902 225317 INFO nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Took 4.09 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.902 225317 DEBUG nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.909 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.911 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:57:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.931 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.961 225317 INFO nova.compute.manager [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Took 4.73 seconds to build instance.#033[00m
Jan 22 04:57:26 np0005591762 nova_compute[225313]: 2026-01-22 09:57:26.972 225317 DEBUG oslo_concurrency.lockutils [None req-81c62e82-eb62-4b65-a469-78d37bab834c 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:26 np0005591762 podman[231602]: 2026-01-22 09:57:26.996613396 +0000 UTC m=+0.039945719 container create ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:57:27 np0005591762 systemd[1]: Started libpod-conmon-ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22.scope.
Jan 22 04:57:27 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:57:27 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/821d88c2dd41c3cb62f86f07bc868e592e1a5a0972f7547d04ef2e73f6f3dba0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:57:27 np0005591762 podman[231602]: 2026-01-22 09:57:27.060913075 +0000 UTC m=+0.104245417 container init ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 04:57:27 np0005591762 podman[231602]: 2026-01-22 09:57:27.066095669 +0000 UTC m=+0.109427991 container start ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:57:27 np0005591762 podman[231602]: 2026-01-22 09:57:26.980839117 +0000 UTC m=+0.024171459 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:57:27 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:27.073 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:27 np0005591762 neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5[231614]: [NOTICE]   (231618) : New worker (231620) forked
Jan 22 04:57:27 np0005591762 neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5[231614]: [NOTICE]   (231618) : Loading success.
Jan 22 04:57:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:27.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:28.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:28 np0005591762 nova_compute[225313]: 2026-01-22 09:57:28.731 225317 DEBUG nova.compute.manager [req-68ee7510-93bb-404f-b884-2665803455f3 req-866ba766-e5ca-479d-83a3-1b4fb1561d20 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:28 np0005591762 nova_compute[225313]: 2026-01-22 09:57:28.732 225317 DEBUG oslo_concurrency.lockutils [req-68ee7510-93bb-404f-b884-2665803455f3 req-866ba766-e5ca-479d-83a3-1b4fb1561d20 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:28 np0005591762 nova_compute[225313]: 2026-01-22 09:57:28.732 225317 DEBUG oslo_concurrency.lockutils [req-68ee7510-93bb-404f-b884-2665803455f3 req-866ba766-e5ca-479d-83a3-1b4fb1561d20 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:28 np0005591762 nova_compute[225313]: 2026-01-22 09:57:28.732 225317 DEBUG oslo_concurrency.lockutils [req-68ee7510-93bb-404f-b884-2665803455f3 req-866ba766-e5ca-479d-83a3-1b4fb1561d20 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:28 np0005591762 nova_compute[225313]: 2026-01-22 09:57:28.732 225317 DEBUG nova.compute.manager [req-68ee7510-93bb-404f-b884-2665803455f3 req-866ba766-e5ca-479d-83a3-1b4fb1561d20 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] No waiting events found dispatching network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:28 np0005591762 nova_compute[225313]: 2026-01-22 09:57:28.733 225317 WARNING nova.compute.manager [req-68ee7510-93bb-404f-b884-2665803455f3 req-866ba766-e5ca-479d-83a3-1b4fb1561d20 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received unexpected event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d for instance with vm_state active and task_state None.#033[00m
Jan 22 04:57:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:29.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:30.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:30 np0005591762 nova_compute[225313]: 2026-01-22 09:57:30.257 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:30 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:30Z|00061|binding|INFO|Releasing lport 6a8edc0a-12a7-4201-8060-8e8897c3f37a from this chassis (sb_readonly=0)
Jan 22 04:57:30 np0005591762 NetworkManager[48910]: <info>  [1769075850.3423] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 22 04:57:30 np0005591762 NetworkManager[48910]: <info>  [1769075850.3430] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 22 04:57:30 np0005591762 nova_compute[225313]: 2026-01-22 09:57:30.350 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:30 np0005591762 nova_compute[225313]: 2026-01-22 09:57:30.377 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:30 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:30Z|00062|binding|INFO|Releasing lport 6a8edc0a-12a7-4201-8060-8e8897c3f37a from this chassis (sb_readonly=0)
Jan 22 04:57:30 np0005591762 nova_compute[225313]: 2026-01-22 09:57:30.380 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:30 np0005591762 nova_compute[225313]: 2026-01-22 09:57:30.898 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.878 225317 DEBUG nova.compute.manager [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-changed-e82a0b0a-fa8a-4ea6-98e1-12794778865d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.878 225317 DEBUG nova.compute.manager [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Refreshing instance network info cache due to event network-changed-e82a0b0a-fa8a-4ea6-98e1-12794778865d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.878 225317 DEBUG oslo_concurrency.lockutils [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.879 225317 DEBUG oslo_concurrency.lockutils [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.879 225317 DEBUG nova.network.neutron [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Refreshing network info cache for port e82a0b0a-fa8a-4ea6-98e1-12794778865d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:57:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.999 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.999 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:31 np0005591762 nova_compute[225313]: 2026-01-22 09:57:31.999 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.000 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.000 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.000 225317 INFO nova.compute.manager [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Terminating instance#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.001 225317 DEBUG nova.compute.manager [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 04:57:32 np0005591762 kernel: tape82a0b0a-fa (unregistering): left promiscuous mode
Jan 22 04:57:32 np0005591762 NetworkManager[48910]: <info>  [1769075852.0231] device (tape82a0b0a-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.027 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:32Z|00063|binding|INFO|Releasing lport e82a0b0a-fa8a-4ea6-98e1-12794778865d from this chassis (sb_readonly=0)
Jan 22 04:57:32 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:32Z|00064|binding|INFO|Setting lport e82a0b0a-fa8a-4ea6-98e1-12794778865d down in Southbound
Jan 22 04:57:32 np0005591762 ovn_controller[133622]: 2026-01-22T09:57:32Z|00065|binding|INFO|Removing iface tape82a0b0a-fa ovn-installed in OVS
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.028 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.034 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:aa:80 10.100.0.11'], port_security=['fa:16:3e:b7:aa:80 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1829262121', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8e9afaf2-adf6-46e6-8c20-227ea75186b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1829262121', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '36f82e25-219e-420f-acf7-94f16329ca95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b103371-4cd7-44d9-8766-2ade2d0d375d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=e82a0b0a-fa8a-4ea6-98e1-12794778865d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.035 143150 INFO neutron.agent.ovn.metadata.agent [-] Port e82a0b0a-fa8a-4ea6-98e1-12794778865d in datapath 875c8e70-f887-4bf0-ad2e-29e53ca07fc5 unbound from our chassis#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.036 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 875c8e70-f887-4bf0-ad2e-29e53ca07fc5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.036 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1df496c7-b758-4cb8-835e-d0b35321ec31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.037 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5 namespace which is not needed anymore#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.047 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:32.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:32 np0005591762 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 22 04:57:32 np0005591762 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 5.535s CPU time.
Jan 22 04:57:32 np0005591762 systemd-machined[193990]: Machine qemu-3-instance-00000008 terminated.
Jan 22 04:57:32 np0005591762 neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5[231614]: [NOTICE]   (231618) : haproxy version is 2.8.14-c23fe91
Jan 22 04:57:32 np0005591762 neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5[231614]: [NOTICE]   (231618) : path to executable is /usr/sbin/haproxy
Jan 22 04:57:32 np0005591762 neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5[231614]: [ALERT]    (231618) : Current worker (231620) exited with code 143 (Terminated)
Jan 22 04:57:32 np0005591762 neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5[231614]: [WARNING]  (231618) : All workers exited. Exiting... (0)
Jan 22 04:57:32 np0005591762 systemd[1]: libpod-ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22.scope: Deactivated successfully.
Jan 22 04:57:32 np0005591762 conmon[231614]: conmon ead0944fd41c850f7925 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22.scope/container/memory.events
Jan 22 04:57:32 np0005591762 podman[231652]: 2026-01-22 09:57:32.136578023 +0000 UTC m=+0.035260362 container died ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 04:57:32 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22-userdata-shm.mount: Deactivated successfully.
Jan 22 04:57:32 np0005591762 systemd[1]: var-lib-containers-storage-overlay-821d88c2dd41c3cb62f86f07bc868e592e1a5a0972f7547d04ef2e73f6f3dba0-merged.mount: Deactivated successfully.
Jan 22 04:57:32 np0005591762 podman[231652]: 2026-01-22 09:57:32.15668789 +0000 UTC m=+0.055370228 container cleanup ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:57:32 np0005591762 systemd[1]: libpod-conmon-ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22.scope: Deactivated successfully.
Jan 22 04:57:32 np0005591762 podman[231675]: 2026-01-22 09:57:32.197827819 +0000 UTC m=+0.023286639 container remove ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.201 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbf9eb0-227d-409c-ab19-281b6cbf3ca2]: (4, ('Thu Jan 22 09:57:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5 (ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22)\nead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22\nThu Jan 22 09:57:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5 (ead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22)\nead0944fd41c850f79253cb28ea2e0184140ea45d0e5bb564ce091c8d92f8a22\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.202 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4d3aa3-0cac-486b-8d52-ed1d3542fdce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.203 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap875c8e70-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.204 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 kernel: tap875c8e70-f0: left promiscuous mode
Jan 22 04:57:32 np0005591762 NetworkManager[48910]: <info>  [1769075852.2206] manager: (tape82a0b0a-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.220 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.224 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[8119a0b6-f78c-40f9-8d64-2225bdaf09ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.231 225317 INFO nova.virt.libvirt.driver [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Instance destroyed successfully.#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.231 225317 DEBUG nova.objects.instance [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'resources' on Instance uuid 8e9afaf2-adf6-46e6-8c20-227ea75186b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.233 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[bae34056-e197-4a92-be69-8736471c9341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.234 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[fabd18a2-6157-494b-9276-e6858aded008]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.241 225317 DEBUG nova.virt.libvirt.vif [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:57:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-848748631',display_name='tempest-TestNetworkBasicOps-server-848748631',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-848748631',id=8,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKH+U724EaOKBknPaXf2RKgtd8DVtla/9cHasfvf1usw/C9M7fa/JShiAffn9j+2oIvZpWu8+s44VRhSwREi0UFAiqLCqWelOq+C+U2sTsmKXL58Y70x90AVi/BRbYQiBQ==',key_name='tempest-TestNetworkBasicOps-26159022',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:57:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-e4sn1fm8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:57:26Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=8e9afaf2-adf6-46e6-8c20-227ea75186b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.242 225317 DEBUG nova.network.os_vif_util [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.242 225317 DEBUG nova.network.os_vif_util [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.242 225317 DEBUG os_vif [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.244 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.244 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape82a0b0a-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.247 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.247 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.249 225317 INFO os_vif [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:aa:80,bridge_name='br-int',has_traffic_filtering=True,id=e82a0b0a-fa8a-4ea6-98e1-12794778865d,network=Network(875c8e70-f887-4bf0-ad2e-29e53ca07fc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape82a0b0a-fa')#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.248 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[2dbbee80-148c-42d6-b966-4286e715c1f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342409, 'reachable_time': 40645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231696, 'error': None, 'target': 'ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 systemd[1]: run-netns-ovnmeta\x2d875c8e70\x2df887\x2d4bf0\x2dad2e\x2d29e53ca07fc5.mount: Deactivated successfully.
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.251 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-875c8e70-f887-4bf0-ad2e-29e53ca07fc5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 04:57:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:32.251 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[e2335631-2285-4122-993e-c16324215157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.399 225317 INFO nova.virt.libvirt.driver [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Deleting instance files /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1_del#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.400 225317 INFO nova.virt.libvirt.driver [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Deletion of /var/lib/nova/instances/8e9afaf2-adf6-46e6-8c20-227ea75186b1_del complete#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.434 225317 INFO nova.compute.manager [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.434 225317 DEBUG oslo.service.loopingcall [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.434 225317 DEBUG nova.compute.manager [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 04:57:32 np0005591762 nova_compute[225313]: 2026-01-22 09:57:32.435 225317 DEBUG nova.network.neutron [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 04:57:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:33.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.301 225317 DEBUG nova.network.neutron [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Updated VIF entry in instance network info cache for port e82a0b0a-fa8a-4ea6-98e1-12794778865d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.301 225317 DEBUG nova.network.neutron [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Updating instance_info_cache with network_info: [{"id": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "address": "fa:16:3e:b7:aa:80", "network": {"id": "875c8e70-f887-4bf0-ad2e-29e53ca07fc5", "bridge": "br-int", "label": "tempest-network-smoke--1038756392", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82a0b0a-fa", "ovs_interfaceid": "e82a0b0a-fa8a-4ea6-98e1-12794778865d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.317 225317 DEBUG oslo_concurrency.lockutils [req-e44ee9a8-916a-4846-85c2-a9ee302146b6 req-50052665-99b8-4c35-bf81-be62b2caacd7 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-8e9afaf2-adf6-46e6-8c20-227ea75186b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:57:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.936 225317 DEBUG nova.compute.manager [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-vif-unplugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG oslo_concurrency.lockutils [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG oslo_concurrency.lockutils [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG oslo_concurrency.lockutils [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG nova.compute.manager [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] No waiting events found dispatching network-vif-unplugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG nova.compute.manager [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-vif-unplugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG nova.compute.manager [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.937 225317 DEBUG oslo_concurrency.lockutils [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.938 225317 DEBUG oslo_concurrency.lockutils [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.938 225317 DEBUG oslo_concurrency.lockutils [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.938 225317 DEBUG nova.compute.manager [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] No waiting events found dispatching network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:57:33 np0005591762 nova_compute[225313]: 2026-01-22 09:57:33.938 225317 WARNING nova.compute.manager [req-a798b41e-e6a1-4f27-b560-0d2088ea5695 req-b1a19791-42f3-475c-adc3-09bc3a3c1237 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Received unexpected event network-vif-plugged-e82a0b0a-fa8a-4ea6-98e1-12794778865d for instance with vm_state active and task_state deleting.#033[00m
Jan 22 04:57:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:34.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.334 225317 DEBUG nova.network.neutron [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.343 225317 INFO nova.compute.manager [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Took 1.91 seconds to deallocate network for instance.#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.380 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.381 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.416 225317 DEBUG oslo_concurrency.processutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:57:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:57:34 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3468567811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.764 225317 DEBUG oslo_concurrency.processutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.348s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.768 225317 DEBUG nova.compute.provider_tree [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.780 225317 DEBUG nova.scheduler.client.report [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.794 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.810 225317 INFO nova.scheduler.client.report [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Deleted allocations for instance 8e9afaf2-adf6-46e6-8c20-227ea75186b1#033[00m
Jan 22 04:57:34 np0005591762 nova_compute[225313]: 2026-01-22 09:57:34.850 225317 DEBUG oslo_concurrency.lockutils [None req-5b18a842-a1ac-4412-bc02-f4dac5e977c0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "8e9afaf2-adf6-46e6-8c20-227ea75186b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:35.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:35 np0005591762 nova_compute[225313]: 2026-01-22 09:57:35.258 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:36.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:37.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:37 np0005591762 nova_compute[225313]: 2026-01-22 09:57:37.245 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:37 np0005591762 podman[231771]: 2026-01-22 09:57:37.818016669 +0000 UTC m=+0.039911282 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 04:57:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:38.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:57:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:39.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:57:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:40.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:40 np0005591762 nova_compute[225313]: 2026-01-22 09:57:40.259 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:41.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:42.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:42 np0005591762 nova_compute[225313]: 2026-01-22 09:57:42.247 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:43.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:44.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:45.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:45 np0005591762 nova_compute[225313]: 2026-01-22 09:57:45.261 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:45 np0005591762 podman[231796]: 2026-01-22 09:57:45.844170623 +0000 UTC m=+0.067201442 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 04:57:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:46.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:47.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:47.200 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:57:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:47.201 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:57:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:57:47.201 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:57:47 np0005591762 nova_compute[225313]: 2026-01-22 09:57:47.230 225317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769075852.2291849, 8e9afaf2-adf6-46e6-8c20-227ea75186b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:57:47 np0005591762 nova_compute[225313]: 2026-01-22 09:57:47.230 225317 INFO nova.compute.manager [-] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] VM Stopped (Lifecycle Event)#033[00m
Jan 22 04:57:47 np0005591762 nova_compute[225313]: 2026-01-22 09:57:47.247 225317 DEBUG nova.compute.manager [None req-f27d85f4-da7b-47a3-b3cc-f1afa31b2e55 - - - - - -] [instance: 8e9afaf2-adf6-46e6-8c20-227ea75186b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:57:47 np0005591762 nova_compute[225313]: 2026-01-22 09:57:47.248 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:48.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:49.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:50.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:50 np0005591762 nova_compute[225313]: 2026-01-22 09:57:50.262 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:51.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:52.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:52 np0005591762 nova_compute[225313]: 2026-01-22 09:57:52.249 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:53.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:54.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:55.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:55 np0005591762 nova_compute[225313]: 2026-01-22 09:57:55.263 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:56.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:57:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:57.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:57 np0005591762 nova_compute[225313]: 2026-01-22 09:57:57.250 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:57:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:57:58.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:57:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:57:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:57:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:57:59.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:57:59 np0005591762 nova_compute[225313]: 2026-01-22 09:57:59.512 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:59 np0005591762 nova_compute[225313]: 2026-01-22 09:57:59.595 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:57:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:57:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:57:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:57:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:00.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:00 np0005591762 nova_compute[225313]: 2026-01-22 09:58:00.265 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:01.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:02.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:02 np0005591762 nova_compute[225313]: 2026-01-22 09:58:02.251 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:03.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:04.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:05.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:05 np0005591762 nova_compute[225313]: 2026-01-22 09:58:05.266 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:06.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:06 np0005591762 nova_compute[225313]: 2026-01-22 09:58:06.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:06 np0005591762 nova_compute[225313]: 2026-01-22 09:58:06.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 04:58:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:07.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:07 np0005591762 nova_compute[225313]: 2026-01-22 09:58:07.252 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:08.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [WARNING] 021/095808 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 22 04:58:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [NOTICE] 021/095808 (4) : haproxy version is 2.3.17-d1c9119
Jan 22 04:58:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [NOTICE] 021/095808 (4) : path to executable is /usr/local/sbin/haproxy
Jan 22 04:58:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-nfs-cephfs-compute-2-uczfqf[88292]: [ALERT] 021/095808 (4) : backend 'backend' has no server available!
Jan 22 04:58:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:08 np0005591762 nova_compute[225313]: 2026-01-22 09:58:08.732 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:08 np0005591762 nova_compute[225313]: 2026-01-22 09:58:08.733 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 04:58:08 np0005591762 nova_compute[225313]: 2026-01-22 09:58:08.746 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 04:58:08 np0005591762 podman[231869]: 2026-01-22 09:58:08.819003596 +0000 UTC m=+0.038017692 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 22 04:58:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:09.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.190 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.191 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.202 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.255 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.255 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.259 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.260 225317 INFO nova.compute.claims [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.324 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:58:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1745392906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.666 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.343s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.670 225317 DEBUG nova.compute.provider_tree [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:58:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.688 225317 DEBUG nova.scheduler.client.report [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.703 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.703 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.731 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.732 225317 DEBUG nova.network.neutron [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.744 225317 INFO nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.753 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.809 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.809 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.810 225317 INFO nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Creating image(s)#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.830 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.846 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.862 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.864 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.909 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.910 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "9db187949728ea707722fd244d769f131efa8688" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.910 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.911 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.926 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:09 np0005591762 nova_compute[225313]: 2026-01-22 09:58:09.928 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 3a47d151-726b-45ba-a05b-5370ac89942a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.042 225317 DEBUG nova.policy [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4428dd9b0fb64c25b8f33b0050d4ef6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.056 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 3a47d151-726b-45ba-a05b-5370ac89942a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.097 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] resizing rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 22 04:58:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:10.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.146 225317 DEBUG nova.objects.instance [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'migration_context' on Instance uuid 3a47d151-726b-45ba-a05b-5370ac89942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.156 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.157 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Ensure instance console log exists: /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.157 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.157 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.158 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.266 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:10 np0005591762 nova_compute[225313]: 2026-01-22 09:58:10.501 225317 DEBUG nova.network.neutron [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Successfully created port: c8d82fa4-0662-4cff-a072-e825567a344e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 04:58:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:11.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.271 225317 DEBUG nova.network.neutron [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Successfully updated port: c8d82fa4-0662-4cff-a072-e825567a344e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.284 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.284 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.284 225317 DEBUG nova.network.neutron [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.340 225317 DEBUG nova.compute.manager [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-changed-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.340 225317 DEBUG nova.compute.manager [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Refreshing instance network info cache due to event network-changed-c8d82fa4-0662-4cff-a072-e825567a344e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.340 225317 DEBUG oslo_concurrency.lockutils [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.394 225317 DEBUG nova.network.neutron [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 04:58:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.732 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:11 np0005591762 nova_compute[225313]: 2026-01-22 09:58:11.732 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:12.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.183 225317 DEBUG nova.network.neutron [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updating instance_info_cache with network_info: [{"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.198 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.198 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Instance network_info: |[{"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.198 225317 DEBUG oslo_concurrency.lockutils [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.198 225317 DEBUG nova.network.neutron [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Refreshing network info cache for port c8d82fa4-0662-4cff-a072-e825567a344e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.201 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Start _get_guest_xml network_info=[{"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'encryption_options': None, 'image_id': 'bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.204 225317 WARNING nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.208 225317 DEBUG nova.virt.libvirt.host [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.208 225317 DEBUG nova.virt.libvirt.host [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.212 225317 DEBUG nova.virt.libvirt.host [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.213 225317 DEBUG nova.virt.libvirt.host [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.213 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.213 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T09:51:32Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6eff66ba-fb3e-4ca7-b05b-920b01d9affd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.214 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.214 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.214 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.214 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.214 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.214 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.215 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.215 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.215 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.215 225317 DEBUG nova.virt.hardware [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.217 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.253 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.567 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.350s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.592 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.596 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.724 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.951 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.952 225317 DEBUG nova.virt.libvirt.vif [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:58:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2060623440',display_name='tempest-TestNetworkBasicOps-server-2060623440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2060623440',id=10,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXMmkM1dFZ2CyXD6dD2L52v4NoYUb6W+vn5jZeCwXvCRMkq0BkS6TaO0bPPnGuHZNi1RtK7TGfFqGIY4B7tQIv0qWMij9YX9X/riVenKrExJRQsOCH+fb6DIatGWuLzuw==',key_name='tempest-TestNetworkBasicOps-1280118144',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-d801a861',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:58:09Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=3a47d151-726b-45ba-a05b-5370ac89942a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.952 225317 DEBUG nova.network.os_vif_util [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.953 225317 DEBUG nova.network.os_vif_util [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.954 225317 DEBUG nova.objects.instance [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a47d151-726b-45ba-a05b-5370ac89942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.965 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <uuid>3a47d151-726b-45ba-a05b-5370ac89942a</uuid>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <name>instance-0000000a</name>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <memory>131072</memory>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <vcpu>1</vcpu>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:name>tempest-TestNetworkBasicOps-server-2060623440</nova:name>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:creationTime>2026-01-22 09:58:12</nova:creationTime>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:flavor name="m1.nano">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:memory>128</nova:memory>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:disk>1</nova:disk>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:swap>0</nova:swap>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:vcpus>1</nova:vcpus>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </nova:flavor>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:owner>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </nova:owner>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <nova:ports>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <nova:port uuid="c8d82fa4-0662-4cff-a072-e825567a344e">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        </nova:port>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </nova:ports>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </nova:instance>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <sysinfo type="smbios">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <entry name="manufacturer">RDO</entry>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <entry name="product">OpenStack Compute</entry>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <entry name="serial">3a47d151-726b-45ba-a05b-5370ac89942a</entry>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <entry name="uuid">3a47d151-726b-45ba-a05b-5370ac89942a</entry>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <entry name="family">Virtual Machine</entry>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <boot dev="hd"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <smbios mode="sysinfo"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <vmcoreinfo/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <clock offset="utc">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <timer name="hpet" present="no"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <cpu mode="host-model" match="exact">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <disk type="network" device="disk">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/3a47d151-726b-45ba-a05b-5370ac89942a_disk">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <target dev="vda" bus="virtio"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <disk type="network" device="cdrom">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/3a47d151-726b-45ba-a05b-5370ac89942a_disk.config">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <target dev="sda" bus="sata"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <interface type="ethernet">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <mac address="fa:16:3e:4b:e2:e8"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <mtu size="1442"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <target dev="tapc8d82fa4-06"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <serial type="pty">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <log file="/var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/console.log" append="off"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <input type="tablet" bus="usb"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <rng model="virtio">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <backend model="random">/dev/urandom</backend>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <controller type="usb" index="0"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    <memballoon model="virtio">
Jan 22 04:58:12 np0005591762 nova_compute[225313]:      <stats period="10"/>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:58:12 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:58:12 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:58:12 np0005591762 nova_compute[225313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.966 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Preparing to wait for external event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.967 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.967 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.967 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.967 225317 DEBUG nova.virt.libvirt.vif [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:58:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2060623440',display_name='tempest-TestNetworkBasicOps-server-2060623440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2060623440',id=10,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXMmkM1dFZ2CyXD6dD2L52v4NoYUb6W+vn5jZeCwXvCRMkq0BkS6TaO0bPPnGuHZNi1RtK7TGfFqGIY4B7tQIv0qWMij9YX9X/riVenKrExJRQsOCH+fb6DIatGWuLzuw==',key_name='tempest-TestNetworkBasicOps-1280118144',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-d801a861',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:58:09Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=3a47d151-726b-45ba-a05b-5370ac89942a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.968 225317 DEBUG nova.network.os_vif_util [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.968 225317 DEBUG nova.network.os_vif_util [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.968 225317 DEBUG os_vif [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.969 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.969 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.970 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.972 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.972 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8d82fa4-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.972 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8d82fa4-06, col_values=(('external_ids', {'iface-id': 'c8d82fa4-0662-4cff-a072-e825567a344e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:e2:e8', 'vm-uuid': '3a47d151-726b-45ba-a05b-5370ac89942a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.973 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:12 np0005591762 NetworkManager[48910]: <info>  [1769075892.9746] manager: (tapc8d82fa4-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.977 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.978 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:12 np0005591762 nova_compute[225313]: 2026-01-22 09:58:12.978 225317 INFO os_vif [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06')#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.013 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.013 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.014 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:4b:e2:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.014 225317 INFO nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Using config drive#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.032 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.096 225317 DEBUG nova.network.neutron [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updated VIF entry in instance network info cache for port c8d82fa4-0662-4cff-a072-e825567a344e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.097 225317 DEBUG nova.network.neutron [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updating instance_info_cache with network_info: [{"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.110 225317 DEBUG oslo_concurrency.lockutils [req-20eb57cd-f9ab-4cc0-b68b-0bc3f455a174 req-4133f7fc-b849-4e1a-ac63-9582f818ca71 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:58:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:58:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:13.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.255 225317 INFO nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Creating config drive at /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/disk.config#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.260 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpencz1uyk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.378 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpencz1uyk" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.399 225317 DEBUG nova.storage.rbd_utils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 3a47d151-726b-45ba-a05b-5370ac89942a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.402 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/disk.config 3a47d151-726b-45ba-a05b-5370ac89942a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.466367) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075893466391, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1750, "num_deletes": 255, "total_data_size": 4266299, "memory_usage": 4331104, "flush_reason": "Manual Compaction"}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075893472542, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 2786906, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23451, "largest_seqno": 25196, "table_properties": {"data_size": 2779808, "index_size": 4041, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 14985, "raw_average_key_size": 19, "raw_value_size": 2765369, "raw_average_value_size": 3582, "num_data_blocks": 179, "num_entries": 772, "num_filter_entries": 772, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769075745, "oldest_key_time": 1769075745, "file_creation_time": 1769075893, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 6211 microseconds, and 4280 cpu microseconds.
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.472578) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 2786906 bytes OK
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.472589) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.472973) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.472984) EVENT_LOG_v1 {"time_micros": 1769075893472981, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.472993) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 4258306, prev total WAL file size 4258306, number of live WAL files 2.
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.473648) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323530' seq:72057594037927935, type:22 .. '6C6F676D00353031' seq:0, type:0; will stop at (end)
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(2721KB)], [42(13MB)]
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075893473685, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 17243917, "oldest_snapshot_seqno": -1}
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.488 225317 DEBUG oslo_concurrency.processutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/disk.config 3a47d151-726b-45ba-a05b-5370ac89942a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.489 225317 INFO nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Deleting local config drive /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a/disk.config because it was imported into RBD.#033[00m
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5796 keys, 17049878 bytes, temperature: kUnknown
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075893513044, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 17049878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17007321, "index_size": 26959, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14533, "raw_key_size": 146329, "raw_average_key_size": 25, "raw_value_size": 16898720, "raw_average_value_size": 2915, "num_data_blocks": 1112, "num_entries": 5796, "num_filter_entries": 5796, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769075893, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.513319) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 17049878 bytes
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.513762) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 436.3 rd, 431.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 13.8 +0.0 blob) out(16.3 +0.0 blob), read-write-amplify(12.3) write-amplify(6.1) OK, records in: 6320, records dropped: 524 output_compression: NoCompression
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.513778) EVENT_LOG_v1 {"time_micros": 1769075893513770, "job": 24, "event": "compaction_finished", "compaction_time_micros": 39524, "compaction_time_cpu_micros": 25024, "output_level": 6, "num_output_files": 1, "total_output_size": 17049878, "num_input_records": 6320, "num_output_records": 5796, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075893514467, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075893516538, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.473594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.516648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.516653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.516655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.516656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:13.516657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:13 np0005591762 kernel: tapc8d82fa4-06: entered promiscuous mode
Jan 22 04:58:13 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:13Z|00066|binding|INFO|Claiming lport c8d82fa4-0662-4cff-a072-e825567a344e for this chassis.
Jan 22 04:58:13 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:13Z|00067|binding|INFO|c8d82fa4-0662-4cff-a072-e825567a344e: Claiming fa:16:3e:4b:e2:e8 10.100.0.9
Jan 22 04:58:13 np0005591762 NetworkManager[48910]: <info>  [1769075893.5279] manager: (tapc8d82fa4-06): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.528 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.534 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:e2:e8 10.100.0.9'], port_security=['fa:16:3e:4b:e2:e8 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3a47d151-726b-45ba-a05b-5370ac89942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4421835f-206a-4ba7-9834-b1f735d99d9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4fb1cf86-fd68-4d5b-bf1d-3427a3fd17ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=32398acc-7b78-499b-93d7-aab150acf0e9, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=c8d82fa4-0662-4cff-a072-e825567a344e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.535 143150 INFO neutron.agent.ovn.metadata.agent [-] Port c8d82fa4-0662-4cff-a072-e825567a344e in datapath 4421835f-206a-4ba7-9834-b1f735d99d9b bound to our chassis#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.536 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4421835f-206a-4ba7-9834-b1f735d99d9b#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.544 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d0afd3-799e-4ec9-800f-c67a8300c4a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.545 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4421835f-21 in ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.546 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4421835f-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.546 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[4378b4a0-920a-4ab1-b226-d1297cb76b52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.547 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[aabdf513-a78a-4130-9999-8c1fa10683d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 systemd-udevd[232240]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:58:13 np0005591762 systemd-machined[193990]: New machine qemu-4-instance-0000000a.
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.559 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[81ed729e-180e-4530-81dd-afae31b44047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 NetworkManager[48910]: <info>  [1769075893.5641] device (tapc8d82fa4-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:58:13 np0005591762 NetworkManager[48910]: <info>  [1769075893.5648] device (tapc8d82fa4-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.583 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[9834d88c-987c-41a6-bccc-1f824062416c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.596 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:13 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:13Z|00068|binding|INFO|Setting lport c8d82fa4-0662-4cff-a072-e825567a344e ovn-installed in OVS
Jan 22 04:58:13 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:13Z|00069|binding|INFO|Setting lport c8d82fa4-0662-4cff-a072-e825567a344e up in Southbound
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.601 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.611 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[f4541d1c-491d-480b-af14-e1bce8a693ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.614 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[552dec03-99af-42b4-8bd8-e78193eb511a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 NetworkManager[48910]: <info>  [1769075893.6148] manager: (tap4421835f-20): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Jan 22 04:58:13 np0005591762 systemd-udevd[232243]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.638 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8b283d-fc8c-4c77-b417-7a71f127d4b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.641 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[b683a5ba-63e0-4709-a552-38298018a2f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 NetworkManager[48910]: <info>  [1769075893.6574] device (tap4421835f-20): carrier: link connected
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.661 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ff19c4-e6de-4974-b51e-3addd06f1b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.673 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa1445f-6f30-4c62-9ca0-32daa2dd5660]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4421835f-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:01:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347121, 'reachable_time': 32424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232264, 'error': None, 'target': 'ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.684 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[ae789823-c328-46e8-a4ef-d857644e056c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:1c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347121, 'tstamp': 347121}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232265, 'error': None, 'target': 'ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.696 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[cd701066-077f-4bbc-8aba-f2164cd2e118]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4421835f-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:01:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347121, 'reachable_time': 32424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232266, 'error': None, 'target': 'ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.717 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[0162199b-34ff-4724-a64a-85eac3ab88e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.721 225317 DEBUG nova.compute.manager [req-cd0a1921-f505-40e9-9bfb-a57ab2796775 req-fdf12b16-1576-4959-8926-3a3c06d5416c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.721 225317 DEBUG oslo_concurrency.lockutils [req-cd0a1921-f505-40e9-9bfb-a57ab2796775 req-fdf12b16-1576-4959-8926-3a3c06d5416c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.721 225317 DEBUG oslo_concurrency.lockutils [req-cd0a1921-f505-40e9-9bfb-a57ab2796775 req-fdf12b16-1576-4959-8926-3a3c06d5416c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.722 225317 DEBUG oslo_concurrency.lockutils [req-cd0a1921-f505-40e9-9bfb-a57ab2796775 req-fdf12b16-1576-4959-8926-3a3c06d5416c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.722 225317 DEBUG nova.compute.manager [req-cd0a1921-f505-40e9-9bfb-a57ab2796775 req-fdf12b16-1576-4959-8926-3a3c06d5416c e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Processing event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.737 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.737 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.738 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.757 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1d606c18-a7df-4a7b-9b5b-2ec9b42edcfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.758 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4421835f-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.758 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.759 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4421835f-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:13 np0005591762 NetworkManager[48910]: <info>  [1769075893.7608] manager: (tap4421835f-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.760 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:13 np0005591762 kernel: tap4421835f-20: entered promiscuous mode
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.762 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.765 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4421835f-20, col_values=(('external_ids', {'iface-id': '7499991b-91f3-4a9c-9f5f-a9ef20823f92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:13 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:13Z|00070|binding|INFO|Releasing lport 7499991b-91f3-4a9c-9f5f-a9ef20823f92 from this chassis (sb_readonly=0)
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.766 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.769 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4421835f-206a-4ba7-9834-b1f735d99d9b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4421835f-206a-4ba7-9834-b1f735d99d9b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.770 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[f8313504-6bd2-493a-8971-9068c3fc9c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.770 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-4421835f-206a-4ba7-9834-b1f735d99d9b
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/4421835f-206a-4ba7-9834-b1f735d99d9b.pid.haproxy
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID 4421835f-206a-4ba7-9834-b1f735d99d9b
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 04:58:13 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:13.772 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b', 'env', 'PROCESS_TAG=haproxy-4421835f-206a-4ba7-9834-b1f735d99d9b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4421835f-206a-4ba7-9834-b1f735d99d9b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 04:58:13 np0005591762 nova_compute[225313]: 2026-01-22 09:58:13.779 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:14 np0005591762 podman[232295]: 2026-01-22 09:58:14.052326223 +0000 UTC m=+0.030647262 container create 907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 04:58:14 np0005591762 systemd[1]: Started libpod-conmon-907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9.scope.
Jan 22 04:58:14 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:58:14 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e4d08272d58c01d7a81fa8a399044c67e4a0f98d989dd4527cfad509c4717b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:58:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:14.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:14 np0005591762 podman[232295]: 2026-01-22 09:58:14.118476972 +0000 UTC m=+0.096798021 container init 907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 04:58:14 np0005591762 podman[232295]: 2026-01-22 09:58:14.122284042 +0000 UTC m=+0.100605081 container start 907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 04:58:14 np0005591762 podman[232295]: 2026-01-22 09:58:14.038598422 +0000 UTC m=+0.016919461 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:58:14 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [NOTICE]   (232311) : New worker (232313) forked
Jan 22 04:58:14 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [NOTICE]   (232311) : Loading success.
Jan 22 04:58:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:14 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:14 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:14 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:58:14 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:14 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.997 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075894.996471, 3a47d151-726b-45ba-a05b-5370ac89942a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:58:14 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.997 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] VM Started (Lifecycle Event)#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:14.999 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.002 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.004 225317 INFO nova.virt.libvirt.driver [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Instance spawned successfully.#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.005 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.017 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.019 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.024 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.024 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.024 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.025 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.025 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.025 225317 DEBUG nova.virt.libvirt.driver [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.043 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.043 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075894.9967282, 3a47d151-726b-45ba-a05b-5370ac89942a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.043 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] VM Paused (Lifecycle Event)#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.061 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.063 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075895.0026333, 3a47d151-726b-45ba-a05b-5370ac89942a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.064 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] VM Resumed (Lifecycle Event)#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.070 225317 INFO nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Took 5.26 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.070 225317 DEBUG nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.075 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.077 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.097 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.116 225317 INFO nova.compute.manager [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Took 5.88 seconds to build instance.#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.127 225317 DEBUG oslo_concurrency.lockutils [None req-1aef0d53-e364-4203-a22f-e435ea8c1897 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:15.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.268 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.777 225317 DEBUG nova.compute.manager [req-5ca0c2b2-1e2b-465c-a4c7-e89c58733fe8 req-2d221e56-cd98-4088-b788-5cb2959a9c94 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.777 225317 DEBUG oslo_concurrency.lockutils [req-5ca0c2b2-1e2b-465c-a4c7-e89c58733fe8 req-2d221e56-cd98-4088-b788-5cb2959a9c94 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.777 225317 DEBUG oslo_concurrency.lockutils [req-5ca0c2b2-1e2b-465c-a4c7-e89c58733fe8 req-2d221e56-cd98-4088-b788-5cb2959a9c94 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.777 225317 DEBUG oslo_concurrency.lockutils [req-5ca0c2b2-1e2b-465c-a4c7-e89c58733fe8 req-2d221e56-cd98-4088-b788-5cb2959a9c94 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.778 225317 DEBUG nova.compute.manager [req-5ca0c2b2-1e2b-465c-a4c7-e89c58733fe8 req-2d221e56-cd98-4088-b788-5cb2959a9c94 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] No waiting events found dispatching network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:58:15 np0005591762 nova_compute[225313]: 2026-01-22 09:58:15.778 225317 WARNING nova.compute.manager [req-5ca0c2b2-1e2b-465c-a4c7-e89c58733fe8 req-2d221e56-cd98-4088-b788-5cb2959a9c94 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received unexpected event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e for instance with vm_state active and task_state None.#033[00m
Jan 22 04:58:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:16.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:16 np0005591762 nova_compute[225313]: 2026-01-22 09:58:16.732 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:16 np0005591762 podman[232363]: 2026-01-22 09:58:16.84564523 +0000 UTC m=+0.062446404 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 04:58:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:58:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:17.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:58:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:17 np0005591762 nova_compute[225313]: 2026-01-22 09:58:17.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:17 np0005591762 nova_compute[225313]: 2026-01-22 09:58:17.975 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:18.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:18 np0005591762 nova_compute[225313]: 2026-01-22 09:58:18.250 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:18 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:18.248 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:58:18 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:18.249 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:58:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:19.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:19 np0005591762 NetworkManager[48910]: <info>  [1769075899.2771] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 22 04:58:19 np0005591762 NetworkManager[48910]: <info>  [1769075899.2778] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 22 04:58:19 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:19Z|00071|binding|INFO|Releasing lport 7499991b-91f3-4a9c-9f5f-a9ef20823f92 from this chassis (sb_readonly=0)
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.275 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.313 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:19 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:19Z|00072|binding|INFO|Releasing lport 7499991b-91f3-4a9c-9f5f-a9ef20823f92 from this chassis (sb_readonly=0)
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.318 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.448 225317 DEBUG nova.compute.manager [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-changed-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.449 225317 DEBUG nova.compute.manager [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Refreshing instance network info cache due to event network-changed-c8d82fa4-0662-4cff-a072-e825567a344e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.449 225317 DEBUG oslo_concurrency.lockutils [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.449 225317 DEBUG oslo_concurrency.lockutils [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.449 225317 DEBUG nova.network.neutron [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Refreshing network info cache for port c8d82fa4-0662-4cff-a072-e825567a344e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:58:19 np0005591762 podman[232497]: 2026-01-22 09:58:19.56069967 +0000 UTC m=+0.044340176 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 22 04:58:19 np0005591762 podman[232497]: 2026-01-22 09:58:19.643574752 +0000 UTC m=+0.127215257 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 22 04:58:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.744 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.744 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.745 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:58:19 np0005591762 nova_compute[225313]: 2026-01-22 09:58:19.745 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:20 np0005591762 podman[232628]: 2026-01-22 09:58:20.057552336 +0000 UTC m=+0.040577059 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:58:20 np0005591762 podman[232628]: 2026-01-22 09:58:20.065624129 +0000 UTC m=+0.048648851 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 04:58:20 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:58:20 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2135612296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:58:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:20.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.132 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.387s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.186 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.187 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.270 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.299 225317 DEBUG nova.network.neutron [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updated VIF entry in instance network info cache for port c8d82fa4-0662-4cff-a072-e825567a344e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.300 225317 DEBUG nova.network.neutron [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updating instance_info_cache with network_info: [{"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.314 225317 DEBUG oslo_concurrency.lockutils [req-dae74385-db9e-442c-a9df-2057b76ed681 req-4bf51c9b-48eb-4c92-84bd-5846f04ca933 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:58:20 np0005591762 podman[232700]: 2026-01-22 09:58:20.343349053 +0000 UTC m=+0.057441134 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:58:20 np0005591762 podman[232700]: 2026-01-22 09:58:20.359601224 +0000 UTC m=+0.073693305 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.516 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.517 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4684MB free_disk=59.96738052368164GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.517 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.517 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:20 np0005591762 podman[232752]: 2026-01-22 09:58:20.573285398 +0000 UTC m=+0.049667302 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, version=2.2.4, com.redhat.component=keepalived-container)
Jan 22 04:58:20 np0005591762 podman[232752]: 2026-01-22 09:58:20.5904341 +0000 UTC m=+0.066816005 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, vcs-type=git, version=2.2.4, io.buildah.version=1.28.2, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.608 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Instance 3a47d151-726b-45ba-a05b-5370ac89942a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.609 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.609 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.650 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing inventories for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 04:58:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.696 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating ProviderTree inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.697 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.706 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing aggregate associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.721 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing trait associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, traits: HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX512VAES,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AESNI,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 04:58:20 np0005591762 nova_compute[225313]: 2026-01-22 09:58:20.746 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3592531944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:58:21 np0005591762 nova_compute[225313]: 2026-01-22 09:58:21.129 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.383s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:21 np0005591762 nova_compute[225313]: 2026-01-22 09:58:21.134 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:58:21 np0005591762 nova_compute[225313]: 2026-01-22 09:58:21.147 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:58:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:21.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:21 np0005591762 nova_compute[225313]: 2026-01-22 09:58:21.160 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:58:21 np0005591762 nova_compute[225313]: 2026-01-22 09:58:21.161 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:58:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:22.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:22 np0005591762 nova_compute[225313]: 2026-01-22 09:58:22.978 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:23.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:24.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:25.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:25 np0005591762 nova_compute[225313]: 2026-01-22 09:58:25.271 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:26.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:58:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:26Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4b:e2:e8 10.100.0.9
Jan 22 04:58:26 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:26Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4b:e2:e8 10.100.0.9
Jan 22 04:58:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:27.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:27 np0005591762 nova_compute[225313]: 2026-01-22 09:58:27.981 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:28.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:28 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:28.251 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.475918) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075908475962, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 492, "num_deletes": 251, "total_data_size": 752630, "memory_usage": 761408, "flush_reason": "Manual Compaction"}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075908477935, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 463721, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25201, "largest_seqno": 25688, "table_properties": {"data_size": 461006, "index_size": 751, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7128, "raw_average_key_size": 20, "raw_value_size": 455457, "raw_average_value_size": 1305, "num_data_blocks": 30, "num_entries": 349, "num_filter_entries": 349, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769075893, "oldest_key_time": 1769075893, "file_creation_time": 1769075908, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 2034 microseconds, and 1327 cpu microseconds.
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.477959) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 463721 bytes OK
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.477969) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.478343) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.478354) EVENT_LOG_v1 {"time_micros": 1769075908478351, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.478365) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 749631, prev total WAL file size 749631, number of live WAL files 2.
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.478741) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353031' seq:72057594037927935, type:22 .. '6D67727374617400373533' seq:0, type:0; will stop at (end)
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(452KB)], [45(16MB)]
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075908478932, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 17513599, "oldest_snapshot_seqno": -1}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5628 keys, 13416558 bytes, temperature: kUnknown
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075908508890, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13416558, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13379548, "index_size": 21860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14085, "raw_key_size": 143287, "raw_average_key_size": 25, "raw_value_size": 13278203, "raw_average_value_size": 2359, "num_data_blocks": 890, "num_entries": 5628, "num_filter_entries": 5628, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769075908, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.509044) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13416558 bytes
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.510558) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 584.4 rd, 447.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 16.3 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(66.7) write-amplify(28.9) OK, records in: 6145, records dropped: 517 output_compression: NoCompression
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.510572) EVENT_LOG_v1 {"time_micros": 1769075908510566, "job": 26, "event": "compaction_finished", "compaction_time_micros": 29966, "compaction_time_cpu_micros": 20497, "output_level": 6, "num_output_files": 1, "total_output_size": 13416558, "num_input_records": 6145, "num_output_records": 5628, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075908510803, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075908512562, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.478638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.512662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.512666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.512668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.512669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:28 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:58:28.512671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:58:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:29.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:58:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:30.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:58:30 np0005591762 nova_compute[225313]: 2026-01-22 09:58:30.274 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:31.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:31 np0005591762 nova_compute[225313]: 2026-01-22 09:58:31.909 225317 INFO nova.compute.manager [None req-c6bb74af-b9b1-4e8d-8c46-4b6c3100bac0 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Get console output#033[00m
Jan 22 04:58:31 np0005591762 nova_compute[225313]: 2026-01-22 09:58:31.914 230487 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 04:58:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:32.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:32 np0005591762 nova_compute[225313]: 2026-01-22 09:58:32.983 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:34.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:35 np0005591762 nova_compute[225313]: 2026-01-22 09:58:35.277 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:35 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:35Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4b:e2:e8 10.100.0.9
Jan 22 04:58:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:36.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:58:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:37.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.500 225317 DEBUG nova.compute.manager [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-changed-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.500 225317 DEBUG nova.compute.manager [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Refreshing instance network info cache due to event network-changed-c8d82fa4-0662-4cff-a072-e825567a344e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.500 225317 DEBUG oslo_concurrency.lockutils [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.501 225317 DEBUG oslo_concurrency.lockutils [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.501 225317 DEBUG nova.network.neutron [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Refreshing network info cache for port c8d82fa4-0662-4cff-a072-e825567a344e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.504 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.505 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.505 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.505 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.505 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.506 225317 INFO nova.compute.manager [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Terminating instance#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.507 225317 DEBUG nova.compute.manager [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 04:58:37 np0005591762 kernel: tapc8d82fa4-06 (unregistering): left promiscuous mode
Jan 22 04:58:37 np0005591762 NetworkManager[48910]: <info>  [1769075917.5449] device (tapc8d82fa4-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 04:58:37 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:37Z|00073|binding|INFO|Releasing lport c8d82fa4-0662-4cff-a072-e825567a344e from this chassis (sb_readonly=0)
Jan 22 04:58:37 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:37Z|00074|binding|INFO|Setting lport c8d82fa4-0662-4cff-a072-e825567a344e down in Southbound
Jan 22 04:58:37 np0005591762 ovn_controller[133622]: 2026-01-22T09:58:37Z|00075|binding|INFO|Removing iface tapc8d82fa4-06 ovn-installed in OVS
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.553 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.559 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:e2:e8 10.100.0.9'], port_security=['fa:16:3e:4b:e2:e8 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3a47d151-726b-45ba-a05b-5370ac89942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4421835f-206a-4ba7-9834-b1f735d99d9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4fb1cf86-fd68-4d5b-bf1d-3427a3fd17ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=32398acc-7b78-499b-93d7-aab150acf0e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=c8d82fa4-0662-4cff-a072-e825567a344e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.560 143150 INFO neutron.agent.ovn.metadata.agent [-] Port c8d82fa4-0662-4cff-a072-e825567a344e in datapath 4421835f-206a-4ba7-9834-b1f735d99d9b unbound from our chassis#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.561 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4421835f-206a-4ba7-9834-b1f735d99d9b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.562 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[d4255602-6762-46f7-ab65-4206c83464c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.563 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b namespace which is not needed anymore#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.573 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 22 04:58:37 np0005591762 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 11.792s CPU time.
Jan 22 04:58:37 np0005591762 systemd-machined[193990]: Machine qemu-4-instance-0000000a terminated.
Jan 22 04:58:37 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [NOTICE]   (232311) : haproxy version is 2.8.14-c23fe91
Jan 22 04:58:37 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [NOTICE]   (232311) : path to executable is /usr/sbin/haproxy
Jan 22 04:58:37 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [WARNING]  (232311) : Exiting Master process...
Jan 22 04:58:37 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [WARNING]  (232311) : Exiting Master process...
Jan 22 04:58:37 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [ALERT]    (232311) : Current worker (232313) exited with code 143 (Terminated)
Jan 22 04:58:37 np0005591762 neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b[232307]: [WARNING]  (232311) : All workers exited. Exiting... (0)
Jan 22 04:58:37 np0005591762 systemd[1]: libpod-907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9.scope: Deactivated successfully.
Jan 22 04:58:37 np0005591762 podman[233025]: 2026-01-22 09:58:37.665571687 +0000 UTC m=+0.034095163 container died 907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 04:58:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:37 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9-userdata-shm.mount: Deactivated successfully.
Jan 22 04:58:37 np0005591762 systemd[1]: var-lib-containers-storage-overlay-3e4d08272d58c01d7a81fa8a399044c67e4a0f98d989dd4527cfad509c4717b7-merged.mount: Deactivated successfully.
Jan 22 04:58:37 np0005591762 podman[233025]: 2026-01-22 09:58:37.687957718 +0000 UTC m=+0.056481182 container cleanup 907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 04:58:37 np0005591762 systemd[1]: libpod-conmon-907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9.scope: Deactivated successfully.
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.721 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.724 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 podman[233048]: 2026-01-22 09:58:37.734706665 +0000 UTC m=+0.031903050 container remove 907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.734 225317 INFO nova.virt.libvirt.driver [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Instance destroyed successfully.#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.734 225317 DEBUG nova.objects.instance [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'resources' on Instance uuid 3a47d151-726b-45ba-a05b-5370ac89942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.740 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[704ecf65-68b9-4ee3-a790-6ea40fe0fead]: (4, ('Thu Jan 22 09:58:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b (907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9)\n907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9\nThu Jan 22 09:58:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b (907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9)\n907a5e574a688cee282a6c522750a1f244a1797c692650abec15641d974fbdf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.741 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[5328a2b1-7fc3-47da-a1ec-68d3f87b0cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.742 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4421835f-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:37 np0005591762 kernel: tap4421835f-20: left promiscuous mode
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.743 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.747 225317 DEBUG nova.virt.libvirt.vif [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:58:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2060623440',display_name='tempest-TestNetworkBasicOps-server-2060623440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2060623440',id=10,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXMmkM1dFZ2CyXD6dD2L52v4NoYUb6W+vn5jZeCwXvCRMkq0BkS6TaO0bPPnGuHZNi1RtK7TGfFqGIY4B7tQIv0qWMij9YX9X/riVenKrExJRQsOCH+fb6DIatGWuLzuw==',key_name='tempest-TestNetworkBasicOps-1280118144',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:58:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-d801a861',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:58:15Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=3a47d151-726b-45ba-a05b-5370ac89942a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.747 225317 DEBUG nova.network.os_vif_util [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.748 225317 DEBUG nova.network.os_vif_util [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.748 225317 DEBUG os_vif [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.750 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.751 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8d82fa4-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.753 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.756 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.760 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.761 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.762 225317 INFO os_vif [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:e2:e8,bridge_name='br-int',has_traffic_filtering=True,id=c8d82fa4-0662-4cff-a072-e825567a344e,network=Network(4421835f-206a-4ba7-9834-b1f735d99d9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d82fa4-06')#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.762 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[7c518d3b-701b-418a-9ef0-a64c4501be67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.772 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[1b80f02c-6986-478c-b3ed-08076518a116]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.774 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb2c2be-90a9-4289-8e8b-1f9cf200ece1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.787 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[81613c29-287c-4442-bed8-8f6e81a44726]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347116, 'reachable_time': 18378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233085, 'error': None, 'target': 'ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 systemd[1]: run-netns-ovnmeta\x2d4421835f\x2d206a\x2d4ba7\x2d9834\x2db1f735d99d9b.mount: Deactivated successfully.
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.791 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4421835f-206a-4ba7-9834-b1f735d99d9b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 04:58:37 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:37.791 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[52cf4cc6-b02e-468c-8393-ce7ea3be04d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.924 225317 INFO nova.virt.libvirt.driver [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Deleting instance files /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a_del#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.925 225317 INFO nova.virt.libvirt.driver [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Deletion of /var/lib/nova/instances/3a47d151-726b-45ba-a05b-5370ac89942a_del complete#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.969 225317 INFO nova.compute.manager [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.970 225317 DEBUG oslo.service.loopingcall [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.970 225317 DEBUG nova.compute.manager [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 04:58:37 np0005591762 nova_compute[225313]: 2026-01-22 09:58:37.970 225317 DEBUG nova.network.neutron [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 04:58:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:38.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.432 225317 DEBUG nova.network.neutron [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.443 225317 INFO nova.compute.manager [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Took 0.47 seconds to deallocate network for instance.#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.476 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.476 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.484 225317 DEBUG nova.compute.manager [req-96a27b5c-5687-41af-9d83-53420a2533a2 req-e0e368e1-f95b-4d0a-976c-1810e244ccec e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-vif-deleted-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.524 225317 DEBUG oslo_concurrency.processutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:58:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:58:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1758118082' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.870 225317 DEBUG oslo_concurrency.processutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.874 225317 DEBUG nova.compute.provider_tree [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.890 225317 DEBUG nova.scheduler.client.report [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.904 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.922 225317 INFO nova.scheduler.client.report [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Deleted allocations for instance 3a47d151-726b-45ba-a05b-5370ac89942a#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.953 225317 DEBUG nova.network.neutron [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updated VIF entry in instance network info cache for port c8d82fa4-0662-4cff-a072-e825567a344e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.954 225317 DEBUG nova.network.neutron [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Updating instance_info_cache with network_info: [{"id": "c8d82fa4-0662-4cff-a072-e825567a344e", "address": "fa:16:3e:4b:e2:e8", "network": {"id": "4421835f-206a-4ba7-9834-b1f735d99d9b", "bridge": "br-int", "label": "tempest-network-smoke--307498347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d82fa4-06", "ovs_interfaceid": "c8d82fa4-0662-4cff-a072-e825567a344e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.969 225317 DEBUG oslo_concurrency.lockutils [None req-0064e8c6-02ad-432a-841f-e3629bf2d86b 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:38 np0005591762 nova_compute[225313]: 2026-01-22 09:58:38.971 225317 DEBUG oslo_concurrency.lockutils [req-2d0efa1c-b97c-474d-a43c-1b9bff321a50 req-5f9c0fd5-68e3-4f6f-96a7-60515e69cf78 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-3a47d151-726b-45ba-a05b-5370ac89942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:58:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000021s ======
Jan 22 04:58:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:39.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000021s
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.574 225317 DEBUG nova.compute.manager [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-vif-unplugged-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.574 225317 DEBUG oslo_concurrency.lockutils [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.575 225317 DEBUG oslo_concurrency.lockutils [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.575 225317 DEBUG oslo_concurrency.lockutils [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.575 225317 DEBUG nova.compute.manager [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] No waiting events found dispatching network-vif-unplugged-c8d82fa4-0662-4cff-a072-e825567a344e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.575 225317 WARNING nova.compute.manager [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received unexpected event network-vif-unplugged-c8d82fa4-0662-4cff-a072-e825567a344e for instance with vm_state deleted and task_state None.#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.575 225317 DEBUG nova.compute.manager [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.576 225317 DEBUG oslo_concurrency.lockutils [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.576 225317 DEBUG oslo_concurrency.lockutils [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.576 225317 DEBUG oslo_concurrency.lockutils [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "3a47d151-726b-45ba-a05b-5370ac89942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.576 225317 DEBUG nova.compute.manager [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] No waiting events found dispatching network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:58:39 np0005591762 nova_compute[225313]: 2026-01-22 09:58:39.576 225317 WARNING nova.compute.manager [req-eace3b20-742e-4b84-b97f-e8f916e8b169 req-a0471e86-67ec-446a-b6a7-f68537ffd87a e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Received unexpected event network-vif-plugged-c8d82fa4-0662-4cff-a072-e825567a344e for instance with vm_state deleted and task_state None.#033[00m
Jan 22 04:58:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:39 np0005591762 podman[233115]: 2026-01-22 09:58:39.841020693 +0000 UTC m=+0.063854368 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:58:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:40.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:40 np0005591762 nova_compute[225313]: 2026-01-22 09:58:40.281 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:41.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:42.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:42 np0005591762 nova_compute[225313]: 2026-01-22 09:58:42.477 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:42 np0005591762 nova_compute[225313]: 2026-01-22 09:58:42.561 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:42 np0005591762 nova_compute[225313]: 2026-01-22 09:58:42.753 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:43.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:44.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:45.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:45 np0005591762 nova_compute[225313]: 2026-01-22 09:58:45.282 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:46.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:47.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:47.201 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:58:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:47.202 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:58:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:58:47.202 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:58:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:47 np0005591762 nova_compute[225313]: 2026-01-22 09:58:47.754 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:47 np0005591762 podman[233140]: 2026-01-22 09:58:47.832716011 +0000 UTC m=+0.055188646 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 04:58:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:48.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:49.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:58:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:50.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:58:50 np0005591762 nova_compute[225313]: 2026-01-22 09:58:50.284 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:51.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:52.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:52 np0005591762 nova_compute[225313]: 2026-01-22 09:58:52.728 225317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769075917.727608, 3a47d151-726b-45ba-a05b-5370ac89942a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:58:52 np0005591762 nova_compute[225313]: 2026-01-22 09:58:52.729 225317 INFO nova.compute.manager [-] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] VM Stopped (Lifecycle Event)#033[00m
Jan 22 04:58:52 np0005591762 nova_compute[225313]: 2026-01-22 09:58:52.750 225317 DEBUG nova.compute.manager [None req-48338036-c80d-475e-9b5d-a1e127866a1c - - - - - -] [instance: 3a47d151-726b-45ba-a05b-5370ac89942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:58:52 np0005591762 nova_compute[225313]: 2026-01-22 09:58:52.755 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:53.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:58:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:54.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:58:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:58:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:55.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:58:55 np0005591762 nova_compute[225313]: 2026-01-22 09:58:55.286 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:56.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:58:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:57.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:57 np0005591762 nova_compute[225313]: 2026-01-22 09:58:57.756 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:58:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:58:58.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:58:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:58:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:58:59.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:58:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:58:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:58:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:58:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:59:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:00.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:59:00 np0005591762 nova_compute[225313]: 2026-01-22 09:59:00.287 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:00 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:01.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:01 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:02.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:02 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:02 np0005591762 nova_compute[225313]: 2026-01-22 09:59:02.756 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:03.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:03 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:04.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:04 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.164 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.164 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.175 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 04:59:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:05.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.225 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.225 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.230 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.231 225317 INFO nova.compute.claims [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.289 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.300 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:59:05 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2812215266' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.642 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.645 225317 DEBUG nova.compute.provider_tree [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.657 225317 DEBUG nova.scheduler.client.report [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.670 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.671 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 04:59:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:05 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.706 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.707 225317 DEBUG nova.network.neutron [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.719 225317 INFO nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.731 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.844 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.848 225317 WARNING nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.848 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Triggering sync for uuid 615a2021-5fec-4c87-b900-a7adeee0822a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.848 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
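[editor's note] The `_sync_power_states` warning above reflects a simple set difference: instances the database knows about versus guests the hypervisor reports. A minimal sketch of that comparison, using only the UUID visible in this log (the set-based framing is an illustration, not Nova's literal code):

```python
# Instances recorded in the Nova DB for this host (UUID taken from this log).
db_uuids = {"615a2021-5fec-4c87-b900-a7adeee0822a"}
# Guests currently defined on the hypervisor -- empty here, since libvirt
# has not yet spawned the domain at this point in the build.
hypervisor_uuids = set()

# Each instance in the DB but missing from the hypervisor gets a sync triggered.
to_sync = db_uuids - hypervisor_uuids
print(len(db_uuids), len(hypervisor_uuids), sorted(to_sync))
```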
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.897 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.898 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.898 225317 INFO nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Creating image(s)
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.916 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.932 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.949 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.951 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.997 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
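[editor's note] The `qemu-img info --force-share --output=json` call above returns machine-readable metadata about the cached base image. A minimal sketch of consuming such output; the sample payload here is invented for illustration (the real values on this host were not captured in the log), but the field names match qemu-img's JSON schema:

```python
import json

# Hypothetical stand-in for the JSON that qemu-img printed for the base image.
sample_output = json.dumps({
    "virtual-size": 117440512,
    "filename": "/var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688",
    "format": "qcow2",
    "actual-size": 21430272,
})

info = json.loads(sample_output)
# Nova uses fields like these to decide how to import/resize the disk.
print(info["format"], info["virtual-size"])
```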
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.998 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "9db187949728ea707722fd244d769f131efa8688" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.998 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 04:59:05 np0005591762 nova_compute[225313]: 2026-01-22 09:59:05.999 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.015 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.017 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 615a2021-5fec-4c87-b900-a7adeee0822a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.127 225317 DEBUG nova.policy [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4428dd9b0fb64c25b8f33b0050d4ef6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.152 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 615a2021-5fec-4c87-b900-a7adeee0822a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 04:59:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:59:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:06.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
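[editor's note] The radosgw "beast" access lines above follow a fixed shape (client IP, user, timestamp, request line, status, latency). A small sketch of pulling the interesting fields out of one such line; the line is copied from this log, while the regex and field names are this editor's illustration:

```python
import re

# Access-log line as emitted by radosgw's beast frontend (from the log above).
line = ('beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous '
        '[22/Jan/2026:09:59:06.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.001000011s')

m = re.search(
    r'(\d+\.\d+\.\d+\.\d+) .*"(\w+) (\S+) ([^"]+)" (\d+) .*latency=([\d.]+)s',
    line)
client, method, path, proto, status, latency = m.groups()
print(client, method, status, float(latency))
```

Lines like these (HEAD / from the controller IPs every second or so) are typically load-balancer health checks rather than user traffic.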
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.195 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] resizing rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
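[editor's note] The resize target above, 1073741824 bytes, is just the flavor's root disk size expressed in bytes; the flavor dumped later in this log (m1.nano) has root_gb=1. The arithmetic:

```python
# m1.nano root disk, per the Flavor record later in this log (root_gb=1).
root_gb = 1
# Nova sizes RBD images in bytes: 1 GiB = 1024**3 bytes.
target_bytes = root_gb * 1024 ** 3
print(target_bytes)  # 1073741824
```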
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.245 225317 DEBUG nova.objects.instance [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'migration_context' on Instance uuid 615a2021-5fec-4c87-b900-a7adeee0822a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.255 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.255 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Ensure instance console log exists: /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.256 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.256 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.256 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 04:59:06 np0005591762 nova_compute[225313]: 2026-01-22 09:59:06.566 225317 DEBUG nova.network.neutron [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Successfully created port: 9b2f10a1-c537-4e0f-80d7-2208da62b14e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 04:59:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:06 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:07.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.287 225317 DEBUG nova.network.neutron [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Successfully updated port: 9b2f10a1-c537-4e0f-80d7-2208da62b14e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.298 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.298 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.298 225317 DEBUG nova.network.neutron [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.360 225317 DEBUG nova.compute.manager [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-changed-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.360 225317 DEBUG nova.compute.manager [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Refreshing instance network info cache due to event network-changed-9b2f10a1-c537-4e0f-80d7-2208da62b14e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.360 225317 DEBUG oslo_concurrency.lockutils [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.406 225317 DEBUG nova.network.neutron [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 04:59:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:07 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.757 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.837 225317 DEBUG nova.network.neutron [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updating instance_info_cache with network_info: [{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.852 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.852 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Instance network_info: |[{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.852 225317 DEBUG oslo_concurrency.lockutils [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.852 225317 DEBUG nova.network.neutron [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Refreshing network info cache for port 9b2f10a1-c537-4e0f-80d7-2208da62b14e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.854 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Start _get_guest_xml network_info=[{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'encryption_options': None, 'image_id': 'bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.857 225317 WARNING nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.861 225317 DEBUG nova.virt.libvirt.host [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.861 225317 DEBUG nova.virt.libvirt.host [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.865 225317 DEBUG nova.virt.libvirt.host [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.866 225317 DEBUG nova.virt.libvirt.host [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.866 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.866 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T09:51:32Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6eff66ba-fb3e-4ca7-b05b-920b01d9affd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.867 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.867 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.867 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.867 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.867 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.868 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.868 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.868 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.868 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.868 225317 DEBUG nova.virt.hardware [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
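[editor's note] The vCPU-topology trace above shows Nova enumerating (sockets, cores, threads) triples for 1 vCPU with no flavor/image constraints and finding exactly one, 1:1:1. A simplified sketch of that enumeration; this is an illustration of the idea, not Nova's actual `_get_possible_cpu_topologies` code:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus."""
    return [(s, c, t)
            for s in range(1, min(vcpus, max_sockets) + 1)
            for c in range(1, min(vcpus, max_cores) + 1)
            for t in range(1, min(vcpus, max_threads) + 1)
            if s * c * t == vcpus]

# For the 1-vCPU m1.nano guest in this log, only 1:1:1 is possible.
print(possible_topologies(1))  # [(1, 1, 1)]
```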
Jan 22 04:59:07 np0005591762 nova_compute[225313]: 2026-01-22 09:59:07.870 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:08.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3230951590' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.212 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.229 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.231 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3843890131' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.567 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.569 225317 DEBUG nova.virt.libvirt.vif [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:59:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2101191055',display_name='tempest-TestNetworkBasicOps-server-2101191055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2101191055',id=12,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHDd3tLK0qOTOnabOoT/zr/8CcHtf+fiioIVMVqhPuHyVoeFz+rjYDj7N8df/E3fgkEQ1WBcMMjyDDNfK4VupQ+KYkGfkVd1ELcvrCa7w1n3HeiNTNAirUYOrzn6gDul7w==',key_name='tempest-TestNetworkBasicOps-1994412307',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-wyrk9iba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:59:05Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=615a2021-5fec-4c87-b900-a7adeee0822a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.569 225317 DEBUG nova.network.os_vif_util [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.570 225317 DEBUG nova.network.os_vif_util [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.571 225317 DEBUG nova.objects.instance [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 615a2021-5fec-4c87-b900-a7adeee0822a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.584 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <uuid>615a2021-5fec-4c87-b900-a7adeee0822a</uuid>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <name>instance-0000000c</name>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <memory>131072</memory>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <vcpu>1</vcpu>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:name>tempest-TestNetworkBasicOps-server-2101191055</nova:name>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:creationTime>2026-01-22 09:59:07</nova:creationTime>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:flavor name="m1.nano">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:memory>128</nova:memory>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:disk>1</nova:disk>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:swap>0</nova:swap>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:vcpus>1</nova:vcpus>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </nova:flavor>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:owner>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </nova:owner>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <nova:ports>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <nova:port uuid="9b2f10a1-c537-4e0f-80d7-2208da62b14e">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        </nova:port>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </nova:ports>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </nova:instance>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <sysinfo type="smbios">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <system>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <entry name="manufacturer">RDO</entry>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <entry name="product">OpenStack Compute</entry>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <entry name="serial">615a2021-5fec-4c87-b900-a7adeee0822a</entry>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <entry name="uuid">615a2021-5fec-4c87-b900-a7adeee0822a</entry>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <entry name="family">Virtual Machine</entry>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </system>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <os>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <boot dev="hd"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <smbios mode="sysinfo"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </os>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <features>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <vmcoreinfo/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </features>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <clock offset="utc">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <timer name="hpet" present="no"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </clock>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <cpu mode="host-model" match="exact">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  <devices>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <disk type="network" device="disk">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/615a2021-5fec-4c87-b900-a7adeee0822a_disk">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <target dev="vda" bus="virtio"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <disk type="network" device="cdrom">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/615a2021-5fec-4c87-b900-a7adeee0822a_disk.config">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </source>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      </auth>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <target dev="sda" bus="sata"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </disk>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <interface type="ethernet">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <mac address="fa:16:3e:ad:af:b5"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <mtu size="1442"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <target dev="tap9b2f10a1-c5"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </interface>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <serial type="pty">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <log file="/var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/console.log" append="off"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </serial>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <video>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </video>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <input type="tablet" bus="usb"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <rng model="virtio">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <backend model="random">/dev/urandom</backend>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </rng>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <controller type="usb" index="0"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    <memballoon model="virtio">
Jan 22 04:59:08 np0005591762 nova_compute[225313]:      <stats period="10"/>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 04:59:08 np0005591762 nova_compute[225313]:  </devices>
Jan 22 04:59:08 np0005591762 nova_compute[225313]: </domain>
Jan 22 04:59:08 np0005591762 nova_compute[225313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.585 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Preparing to wait for external event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.586 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.586 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.586 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.587 225317 DEBUG nova.virt.libvirt.vif [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:59:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2101191055',display_name='tempest-TestNetworkBasicOps-server-2101191055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2101191055',id=12,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHDd3tLK0qOTOnabOoT/zr/8CcHtf+fiioIVMVqhPuHyVoeFz+rjYDj7N8df/E3fgkEQ1WBcMMjyDDNfK4VupQ+KYkGfkVd1ELcvrCa7w1n3HeiNTNAirUYOrzn6gDul7w==',key_name='tempest-TestNetworkBasicOps-1994412307',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-wyrk9iba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:59:05Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=615a2021-5fec-4c87-b900-a7adeee0822a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.587 225317 DEBUG nova.network.os_vif_util [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.587 225317 DEBUG nova.network.os_vif_util [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.588 225317 DEBUG os_vif [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.588 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.588 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.589 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.591 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.591 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b2f10a1-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.591 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b2f10a1-c5, col_values=(('external_ids', {'iface-id': '9b2f10a1-c537-4e0f-80d7-2208da62b14e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:af:b5', 'vm-uuid': '615a2021-5fec-4c87-b900-a7adeee0822a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.593 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:08 np0005591762 NetworkManager[48910]: <info>  [1769075948.5939] manager: (tap9b2f10a1-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.595 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.598 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.598 225317 INFO os_vif [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5')#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.633 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.634 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.634 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:ad:af:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.634 225317 INFO nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Using config drive#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.650 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:08 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.698059) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075948698169, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 632, "num_deletes": 251, "total_data_size": 1080487, "memory_usage": 1098704, "flush_reason": "Manual Compaction"}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075948701064, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 710040, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25693, "largest_seqno": 26320, "table_properties": {"data_size": 706989, "index_size": 1023, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7319, "raw_average_key_size": 19, "raw_value_size": 700803, "raw_average_value_size": 1825, "num_data_blocks": 46, "num_entries": 384, "num_filter_entries": 384, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769075909, "oldest_key_time": 1769075909, "file_creation_time": 1769075948, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 2974 microseconds, and 2106 cpu microseconds.
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.701093) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 710040 bytes OK
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.701106) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.701617) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.701637) EVENT_LOG_v1 {"time_micros": 1769075948701634, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.701650) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1076988, prev total WAL file size 1076988, number of live WAL files 2.
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.702027) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(693KB)], [48(12MB)]
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075948702070, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14126598, "oldest_snapshot_seqno": -1}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5501 keys, 12051692 bytes, temperature: kUnknown
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075948728915, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12051692, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12016670, "index_size": 20192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 141328, "raw_average_key_size": 25, "raw_value_size": 11918692, "raw_average_value_size": 2166, "num_data_blocks": 815, "num_entries": 5501, "num_filter_entries": 5501, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769075948, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.729047) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12051692 bytes
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.730356) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 525.6 rd, 448.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.8 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(36.9) write-amplify(17.0) OK, records in: 6012, records dropped: 511 output_compression: NoCompression
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.730370) EVENT_LOG_v1 {"time_micros": 1769075948730364, "job": 28, "event": "compaction_finished", "compaction_time_micros": 26879, "compaction_time_cpu_micros": 18346, "output_level": 6, "num_output_files": 1, "total_output_size": 12051692, "num_input_records": 6012, "num_output_records": 5501, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075948730538, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769075948732223, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.701960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.732248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.732251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.732253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.732254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:59:08 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-09:59:08.732255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.790 225317 DEBUG nova.network.neutron [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updated VIF entry in instance network info cache for port 9b2f10a1-c537-4e0f-80d7-2208da62b14e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.790 225317 DEBUG nova.network.neutron [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updating instance_info_cache with network_info: [{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.801 225317 DEBUG oslo_concurrency.lockutils [req-8c87b8c6-9f87-4a08-afb4-915fb16c72d6 req-6a5bb352-3b30-4293-b60d-cb2b09f54987 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.866 225317 INFO nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Creating config drive at /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/disk.config#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.870 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg0ppct9e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:08 np0005591762 nova_compute[225313]: 2026-01-22 09:59:08.988 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg0ppct9e" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.006 225317 DEBUG nova.storage.rbd_utils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 615a2021-5fec-4c87-b900-a7adeee0822a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.008 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/disk.config 615a2021-5fec-4c87-b900-a7adeee0822a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.090 225317 DEBUG oslo_concurrency.processutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/disk.config 615a2021-5fec-4c87-b900-a7adeee0822a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.090 225317 INFO nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Deleting local config drive /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a/disk.config because it was imported into RBD.#033[00m
Jan 22 04:59:09 np0005591762 kernel: tap9b2f10a1-c5: entered promiscuous mode
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.1246] manager: (tap9b2f10a1-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.126 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.129 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:09Z|00076|binding|INFO|Claiming lport 9b2f10a1-c537-4e0f-80d7-2208da62b14e for this chassis.
Jan 22 04:59:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:09Z|00077|binding|INFO|9b2f10a1-c537-4e0f-80d7-2208da62b14e: Claiming fa:16:3e:ad:af:b5 10.100.0.5
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.133 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.1338] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.1343] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.137 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:af:b5 10.100.0.5'], port_security=['fa:16:3e:ad:af:b5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '615a2021-5fec-4c87-b900-a7adeee0822a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '54f87800-fb58-4b91-a268-998c238e132d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d292be6-42d2-4eb4-af2b-ecad29bc80e3, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=9b2f10a1-c537-4e0f-80d7-2208da62b14e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.138 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 9b2f10a1-c537-4e0f-80d7-2208da62b14e in datapath 0cca0693-e180-41bd-85c4-2ab5918f9a75 bound to our chassis#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.142 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0cca0693-e180-41bd-85c4-2ab5918f9a75#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.150 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[696fbc81-8e03-4efa-907f-9128a13190ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.150 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0cca0693-e1 in ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 04:59:09 np0005591762 systemd-udevd[233534]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.152 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0cca0693-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.152 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba80f6d-5d08-4386-a16c-4f608cd13644]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.152 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[3dab40df-5890-4a35-8b87-ed71a7d420e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.162 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[40d4249b-de66-498d-b547-aa3978d00644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 systemd-machined[193990]: New machine qemu-5-instance-0000000c.
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.1652] device (tap9b2f10a1-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.1658] device (tap9b2f10a1-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 04:59:09 np0005591762 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.184 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0c5d9d-9a93-4220-814b-27aa5ed20e46]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:09.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.206 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[6e68fbdc-97ca-4ec3-be42-9d82c6764edc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.209 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[b8895723-43de-4dae-8f96-2d1a57f4ee98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 systemd-udevd[233538]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.2111] manager: (tap0cca0693-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.224 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.229 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.236 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.245 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[7143581f-9097-4418-ae69-892291f77d80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.247 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[11806425-8a10-4b02-a632-ec3dc9fe8aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:09Z|00078|binding|INFO|Setting lport 9b2f10a1-c537-4e0f-80d7-2208da62b14e ovn-installed in OVS
Jan 22 04:59:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:09Z|00079|binding|INFO|Setting lport 9b2f10a1-c537-4e0f-80d7-2208da62b14e up in Southbound
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.251 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.2642] device (tap0cca0693-e0): carrier: link connected
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.268 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[cb66f51f-21c7-4807-9af1-f5a4d96ea330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.281 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0ec902-d9af-4d6a-abfb-d7b29b98c938]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0cca0693-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:3e:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352682, 'reachable_time': 18376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233559, 'error': None, 'target': 'ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.293 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[ade65c58-9ec0-4dd1-aeb9-443d507c58f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:3eb2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 352682, 'tstamp': 352682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233560, 'error': None, 'target': 'ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.305 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[138651d2-bce6-45e2-a1f2-aebe96bcecbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0cca0693-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:3e:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352682, 'reachable_time': 18376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233561, 'error': None, 'target': 'ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.327 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6a5c17-4eb5-483e-b8d0-b0398c99712f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.366 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c76d3f-ca01-4d24-983d-a49d3d79336c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.366 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0cca0693-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.367 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.367 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0cca0693-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:09 np0005591762 NetworkManager[48910]: <info>  [1769075949.3692] manager: (tap0cca0693-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 22 04:59:09 np0005591762 kernel: tap0cca0693-e0: entered promiscuous mode
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.368 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.371 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0cca0693-e0, col_values=(('external_ids', {'iface-id': 'cd9de746-25ad-4245-895c-bca4e97ef227'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:09 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:09Z|00080|binding|INFO|Releasing lport cd9de746-25ad-4245-895c-bca4e97ef227 from this chassis (sb_readonly=0)
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.387 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0cca0693-e180-41bd-85c4-2ab5918f9a75.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0cca0693-e180-41bd-85c4-2ab5918f9a75.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.387 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.390 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[bf64a898-6b3f-4e12-b284-9fa978e05732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.391 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-0cca0693-e180-41bd-85c4-2ab5918f9a75
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/0cca0693-e180-41bd-85c4-2ab5918f9a75.pid.haproxy
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID 0cca0693-e180-41bd-85c4-2ab5918f9a75
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 04:59:09 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:09.393 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'env', 'PROCESS_TAG=haproxy-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0cca0693-e180-41bd-85c4-2ab5918f9a75.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 04:59:09 np0005591762 podman[233588]: 2026-01-22 09:59:09.671208514 +0000 UTC m=+0.031480723 container create 7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 04:59:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:09 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:09 np0005591762 systemd[1]: Started libpod-conmon-7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43.scope.
Jan 22 04:59:09 np0005591762 systemd[1]: Started libcrun container.
Jan 22 04:59:09 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f15698cb8dd4601972dd2f649da41062c9f4889c3883ed30e93a8f90facd51a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 04:59:09 np0005591762 podman[233588]: 2026-01-22 09:59:09.728986543 +0000 UTC m=+0.089258772 container init 7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 04:59:09 np0005591762 podman[233588]: 2026-01-22 09:59:09.733092706 +0000 UTC m=+0.093364916 container start 7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 04:59:09 np0005591762 podman[233588]: 2026-01-22 09:59:09.657778085 +0000 UTC m=+0.018050294 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.742 225317 DEBUG nova.compute.manager [req-50d01bd2-9a6c-4ae8-a0c1-f033d4d863ac req-ac5365f8-9b7e-4297-8980-836bdf8be701 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.743 225317 DEBUG oslo_concurrency.lockutils [req-50d01bd2-9a6c-4ae8-a0c1-f033d4d863ac req-ac5365f8-9b7e-4297-8980-836bdf8be701 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.743 225317 DEBUG oslo_concurrency.lockutils [req-50d01bd2-9a6c-4ae8-a0c1-f033d4d863ac req-ac5365f8-9b7e-4297-8980-836bdf8be701 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.743 225317 DEBUG oslo_concurrency.lockutils [req-50d01bd2-9a6c-4ae8-a0c1-f033d4d863ac req-ac5365f8-9b7e-4297-8980-836bdf8be701 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:09 np0005591762 nova_compute[225313]: 2026-01-22 09:59:09.744 225317 DEBUG nova.compute.manager [req-50d01bd2-9a6c-4ae8-a0c1-f033d4d863ac req-ac5365f8-9b7e-4297-8980-836bdf8be701 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Processing event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 04:59:09 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [NOTICE]   (233604) : New worker (233606) forked
Jan 22 04:59:09 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [NOTICE]   (233604) : Loading success.
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.153 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.154 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075950.1531591, 615a2021-5fec-4c87-b900-a7adeee0822a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.154 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] VM Started (Lifecycle Event)#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.157 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.160 225317 INFO nova.virt.libvirt.driver [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Instance spawned successfully.#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.160 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.172 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.176 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.179 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.179 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.179 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.180 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.180 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.181 225317 DEBUG nova.virt.libvirt.driver [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 04:59:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:10.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.195 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.196 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075950.1532478, 615a2021-5fec-4c87-b900-a7adeee0822a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.196 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] VM Paused (Lifecycle Event)#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.213 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.215 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769075950.1574914, 615a2021-5fec-4c87-b900-a7adeee0822a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.215 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] VM Resumed (Lifecycle Event)#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.221 225317 INFO nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Took 4.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.221 225317 DEBUG nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.227 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.228 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.247 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.267 225317 INFO nova.compute.manager [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Took 5.06 seconds to build instance.#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.275 225317 DEBUG oslo_concurrency.lockutils [None req-79aa234b-64b4-4d23-b677-4c4ad1081a35 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.276 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "615a2021-5fec-4c87-b900-a7adeee0822a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 4.427s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.276 225317 INFO nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] During sync_power_state the instance has a pending task (networking). Skip.#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.276 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "615a2021-5fec-4c87-b900-a7adeee0822a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:10 np0005591762 nova_compute[225313]: 2026-01-22 09:59:10.291 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:10 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:10 np0005591762 podman[233654]: 2026-01-22 09:59:10.843637263 +0000 UTC m=+0.065182173 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 04:59:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:11.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:11 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:11 np0005591762 nova_compute[225313]: 2026-01-22 09:59:11.807 225317 DEBUG nova.compute.manager [req-51a1af28-4197-44a1-adcc-0d3db78fcae6 req-f78d2e6e-cb0d-475f-9690-a9f5cb4d57c0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:11 np0005591762 nova_compute[225313]: 2026-01-22 09:59:11.807 225317 DEBUG oslo_concurrency.lockutils [req-51a1af28-4197-44a1-adcc-0d3db78fcae6 req-f78d2e6e-cb0d-475f-9690-a9f5cb4d57c0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:11 np0005591762 nova_compute[225313]: 2026-01-22 09:59:11.807 225317 DEBUG oslo_concurrency.lockutils [req-51a1af28-4197-44a1-adcc-0d3db78fcae6 req-f78d2e6e-cb0d-475f-9690-a9f5cb4d57c0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:11 np0005591762 nova_compute[225313]: 2026-01-22 09:59:11.807 225317 DEBUG oslo_concurrency.lockutils [req-51a1af28-4197-44a1-adcc-0d3db78fcae6 req-f78d2e6e-cb0d-475f-9690-a9f5cb4d57c0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:11 np0005591762 nova_compute[225313]: 2026-01-22 09:59:11.808 225317 DEBUG nova.compute.manager [req-51a1af28-4197-44a1-adcc-0d3db78fcae6 req-f78d2e6e-cb0d-475f-9690-a9f5cb4d57c0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] No waiting events found dispatching network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:59:11 np0005591762 nova_compute[225313]: 2026-01-22 09:59:11.808 225317 WARNING nova.compute.manager [req-51a1af28-4197-44a1-adcc-0d3db78fcae6 req-f78d2e6e-cb0d-475f-9690-a9f5cb4d57c0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received unexpected event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e for instance with vm_state active and task_state None.#033[00m
Jan 22 04:59:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:12.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:12 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:13.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:13 np0005591762 nova_compute[225313]: 2026-01-22 09:59:13.593 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:13 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:13 np0005591762 nova_compute[225313]: 2026-01-22 09:59:13.835 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:14.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:14 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.855 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.856 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquired lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.856 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 04:59:14 np0005591762 nova_compute[225313]: 2026-01-22 09:59:14.856 225317 DEBUG nova.objects.instance [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 615a2021-5fec-4c87-b900-a7adeee0822a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:59:15 np0005591762 nova_compute[225313]: 2026-01-22 09:59:15.179 225317 DEBUG nova.compute.manager [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-changed-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:15 np0005591762 nova_compute[225313]: 2026-01-22 09:59:15.180 225317 DEBUG nova.compute.manager [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Refreshing instance network info cache due to event network-changed-9b2f10a1-c537-4e0f-80d7-2208da62b14e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:59:15 np0005591762 nova_compute[225313]: 2026-01-22 09:59:15.180 225317 DEBUG oslo_concurrency.lockutils [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:59:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:15 np0005591762 nova_compute[225313]: 2026-01-22 09:59:15.293 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:15 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:16.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:16 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.061 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updating instance_info_cache with network_info: [{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.074 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Releasing lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.074 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.075 225317 DEBUG oslo_concurrency.lockutils [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.075 225317 DEBUG nova.network.neutron [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Refreshing network info cache for port 9b2f10a1-c537-4e0f-80d7-2208da62b14e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.076 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.077 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.077 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.077 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.077 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 04:59:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:17.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:17 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:17 np0005591762 nova_compute[225313]: 2026-01-22 09:59:17.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:18.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:18 np0005591762 nova_compute[225313]: 2026-01-22 09:59:18.402 225317 DEBUG nova.network.neutron [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updated VIF entry in instance network info cache for port 9b2f10a1-c537-4e0f-80d7-2208da62b14e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:59:18 np0005591762 nova_compute[225313]: 2026-01-22 09:59:18.403 225317 DEBUG nova.network.neutron [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updating instance_info_cache with network_info: [{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:59:18 np0005591762 nova_compute[225313]: 2026-01-22 09:59:18.413 225317 DEBUG oslo_concurrency.lockutils [req-86caf993-dc21-48d3-ae0a-7da6be193528 req-b4486ba6-e775-416c-b245-cbc2353e2bc0 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:59:18 np0005591762 nova_compute[225313]: 2026-01-22 09:59:18.596 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:18 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:18 np0005591762 nova_compute[225313]: 2026-01-22 09:59:18.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:18 np0005591762 podman[233705]: 2026-01-22 09:59:18.841858786 +0000 UTC m=+0.059702742 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 04:59:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:19 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:20.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:20 np0005591762 nova_compute[225313]: 2026-01-22 09:59:20.295 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:20 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:20Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:af:b5 10.100.0.5
Jan 22 04:59:20 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:20Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:af:b5 10.100.0.5
Jan 22 04:59:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:21.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:21 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:21 np0005591762 nova_compute[225313]: 2026-01-22 09:59:21.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 04:59:21 np0005591762 nova_compute[225313]: 2026-01-22 09:59:21.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:21 np0005591762 nova_compute[225313]: 2026-01-22 09:59:21.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:21 np0005591762 nova_compute[225313]: 2026-01-22 09:59:21.740 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:21 np0005591762 nova_compute[225313]: 2026-01-22 09:59:21.740 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 04:59:21 np0005591762 nova_compute[225313]: 2026-01-22 09:59:21.740 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:59:22 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/500229089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.079 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.339s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.121 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.121 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 04:59:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:22.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.313 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.314 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4687MB free_disk=59.89809799194336GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.314 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.315 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.372 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Instance 615a2021-5fec-4c87-b900-a7adeee0822a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.373 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.373 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.409 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:22 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.749 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.340s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.753 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.766 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.788 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 04:59:22 np0005591762 nova_compute[225313]: 2026-01-22 09:59:22.789 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:23.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:23 np0005591762 nova_compute[225313]: 2026-01-22 09:59:23.599 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:23 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:59:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:24.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:59:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:24 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:25.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:25 np0005591762 nova_compute[225313]: 2026-01-22 09:59:25.296 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:25 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:26.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:26 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 04:59:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:59:26 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:59:26 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 04:59:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:27.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:27 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:59:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:28.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:59:28 np0005591762 nova_compute[225313]: 2026-01-22 09:59:28.601 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:28 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:29.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:29 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:59:29 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 04:59:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:30.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:30 np0005591762 nova_compute[225313]: 2026-01-22 09:59:30.297 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:30 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:31.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:31 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:32.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:32 np0005591762 nova_compute[225313]: 2026-01-22 09:59:32.257 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:32.257 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:59:32 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:32.258 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 04:59:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:32 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:33.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:33 np0005591762 nova_compute[225313]: 2026-01-22 09:59:33.603 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:33 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:34.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:34 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:35.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:35 np0005591762 nova_compute[225313]: 2026-01-22 09:59:35.299 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:35 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.273 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.274 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.274 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.274 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.274 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.276 225317 INFO nova.compute.manager [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Terminating instance#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.276 225317 DEBUG nova.compute.manager [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 04:59:36 np0005591762 kernel: tap9b2f10a1-c5 (unregistering): left promiscuous mode
Jan 22 04:59:36 np0005591762 NetworkManager[48910]: <info>  [1769075976.3089] device (tap9b2f10a1-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 04:59:36 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:36Z|00081|binding|INFO|Releasing lport 9b2f10a1-c537-4e0f-80d7-2208da62b14e from this chassis (sb_readonly=0)
Jan 22 04:59:36 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:36Z|00082|binding|INFO|Setting lport 9b2f10a1-c537-4e0f-80d7-2208da62b14e down in Southbound
Jan 22 04:59:36 np0005591762 ovn_controller[133622]: 2026-01-22T09:59:36Z|00083|binding|INFO|Removing iface tap9b2f10a1-c5 ovn-installed in OVS
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.315 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.317 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.324 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:af:b5 10.100.0.5'], port_security=['fa:16:3e:ad:af:b5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '615a2021-5fec-4c87-b900-a7adeee0822a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54f87800-fb58-4b91-a268-998c238e132d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d292be6-42d2-4eb4-af2b-ecad29bc80e3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=9b2f10a1-c537-4e0f-80d7-2208da62b14e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.325 143150 INFO neutron.agent.ovn.metadata.agent [-] Port 9b2f10a1-c537-4e0f-80d7-2208da62b14e in datapath 0cca0693-e180-41bd-85c4-2ab5918f9a75 unbound from our chassis#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.326 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0cca0693-e180-41bd-85c4-2ab5918f9a75, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.326 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[ececbdf4-9a47-4aa1-a7e4-add5417c4663]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.327 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75 namespace which is not needed anymore#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.332 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 22 04:59:36 np0005591762 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 12.016s CPU time.
Jan 22 04:59:36 np0005591762 systemd-machined[193990]: Machine qemu-5-instance-0000000c terminated.
Jan 22 04:59:36 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [NOTICE]   (233604) : haproxy version is 2.8.14-c23fe91
Jan 22 04:59:36 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [NOTICE]   (233604) : path to executable is /usr/sbin/haproxy
Jan 22 04:59:36 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [WARNING]  (233604) : Exiting Master process...
Jan 22 04:59:36 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [WARNING]  (233604) : Exiting Master process...
Jan 22 04:59:36 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [ALERT]    (233604) : Current worker (233606) exited with code 143 (Terminated)
Jan 22 04:59:36 np0005591762 neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75[233600]: [WARNING]  (233604) : All workers exited. Exiting... (0)
Jan 22 04:59:36 np0005591762 systemd[1]: libpod-7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43.scope: Deactivated successfully.
Jan 22 04:59:36 np0005591762 podman[233941]: 2026-01-22 09:59:36.42460988 +0000 UTC m=+0.033867345 container died 7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 04:59:36 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43-userdata-shm.mount: Deactivated successfully.
Jan 22 04:59:36 np0005591762 systemd[1]: var-lib-containers-storage-overlay-0f15698cb8dd4601972dd2f649da41062c9f4889c3883ed30e93a8f90facd51a-merged.mount: Deactivated successfully.
Jan 22 04:59:36 np0005591762 podman[233941]: 2026-01-22 09:59:36.441983141 +0000 UTC m=+0.051240605 container cleanup 7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 04:59:36 np0005591762 systemd[1]: libpod-conmon-7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43.scope: Deactivated successfully.
Jan 22 04:59:36 np0005591762 podman[233964]: 2026-01-22 09:59:36.480119485 +0000 UTC m=+0.024505350 container remove 7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.487 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[8a23a767-f7b6-405d-8150-c1cda73d2c96]: (4, ('Thu Jan 22 09:59:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75 (7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43)\n7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43\nThu Jan 22 09:59:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75 (7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43)\n7e16ec60a78eae0c4dbc4a5dc62439016e2e09170a3cbebac52d7e34ba724a43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.489 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.490 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9bd396-1ba6-4f57-8774-fe0004f95ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.491 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0cca0693-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.493 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.497 225317 INFO nova.virt.libvirt.driver [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Instance destroyed successfully.#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.497 225317 DEBUG nova.objects.instance [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'resources' on Instance uuid 615a2021-5fec-4c87-b900-a7adeee0822a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.507 225317 DEBUG nova.virt.libvirt.vif [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:59:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2101191055',display_name='tempest-TestNetworkBasicOps-server-2101191055',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2101191055',id=12,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHDd3tLK0qOTOnabOoT/zr/8CcHtf+fiioIVMVqhPuHyVoeFz+rjYDj7N8df/E3fgkEQ1WBcMMjyDDNfK4VupQ+KYkGfkVd1ELcvrCa7w1n3HeiNTNAirUYOrzn6gDul7w==',key_name='tempest-TestNetworkBasicOps-1994412307',keypairs=<?>,launch_index=0,launched_at=2026-01-22T09:59:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-wyrk9iba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T09:59:10Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=615a2021-5fec-4c87-b900-a7adeee0822a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.507 225317 DEBUG nova.network.os_vif_util [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.508 225317 DEBUG nova.network.os_vif_util [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 04:59:36 np0005591762 kernel: tap0cca0693-e0: left promiscuous mode
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.508 225317 DEBUG os_vif [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.511 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.511 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b2f10a1-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.512 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.513 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.516 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[6afc1b01-a940-434d-8a75-478c82c57def]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.518 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.521 225317 INFO os_vif [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:af:b5,bridge_name='br-int',has_traffic_filtering=True,id=9b2f10a1-c537-4e0f-80d7-2208da62b14e,network=Network(0cca0693-e180-41bd-85c4-2ab5918f9a75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2f10a1-c5')#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.525 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[70685608-f5eb-47f3-9d5c-3930958f815a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.526 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2ad5b7-b174-4b14-8f63-8634c23ed22a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.538 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e1ff91-3262-4f3f-a5ab-0f039a1d4273]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352675, 'reachable_time': 26103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233998, 'error': None, 'target': 'ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.541 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0cca0693-e180-41bd-85c4-2ab5918f9a75 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 04:59:36 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:36.541 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[a7550596-a36e-4a4c-9b4b-658796024ede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 04:59:36 np0005591762 systemd[1]: run-netns-ovnmeta\x2d0cca0693\x2de180\x2d41bd\x2d85c4\x2d2ab5918f9a75.mount: Deactivated successfully.
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.679 225317 INFO nova.virt.libvirt.driver [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Deleting instance files /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a_del#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.680 225317 INFO nova.virt.libvirt.driver [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Deletion of /var/lib/nova/instances/615a2021-5fec-4c87-b900-a7adeee0822a_del complete#033[00m
Jan 22 04:59:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:36 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.693 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-changed-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.693 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Refreshing instance network info cache due to event network-changed-9b2f10a1-c537-4e0f-80d7-2208da62b14e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.694 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.694 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.694 225317 DEBUG nova.network.neutron [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Refreshing network info cache for port 9b2f10a1-c537-4e0f-80d7-2208da62b14e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.722 225317 INFO nova.compute.manager [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.722 225317 DEBUG oslo.service.loopingcall [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.723 225317 DEBUG nova.compute.manager [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 04:59:36 np0005591762 nova_compute[225313]: 2026-01-22 09:59:36.723 225317 DEBUG nova.network.neutron [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 04:59:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:37.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:37 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.174 225317 DEBUG nova.network.neutron [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.189 225317 INFO nova.compute.manager [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Took 1.47 seconds to deallocate network for instance.#033[00m
Jan 22 04:59:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:38.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.222 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.223 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.242 225317 DEBUG nova.compute.manager [req-8d86aaa1-28a8-4fd6-a3ed-647871486e9f req-1398f035-0c84-4f02-ba9c-8d2dbed12419 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-vif-deleted-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:38 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:38.260 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.292 225317 DEBUG oslo_concurrency.processutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.512 225317 DEBUG nova.network.neutron [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updated VIF entry in instance network info cache for port 9b2f10a1-c537-4e0f-80d7-2208da62b14e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.513 225317 DEBUG nova.network.neutron [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Updating instance_info_cache with network_info: [{"id": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "address": "fa:16:3e:ad:af:b5", "network": {"id": "0cca0693-e180-41bd-85c4-2ab5918f9a75", "bridge": "br-int", "label": "tempest-network-smoke--586807094", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2f10a1-c5", "ovs_interfaceid": "9b2f10a1-c537-4e0f-80d7-2208da62b14e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.527 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-615a2021-5fec-4c87-b900-a7adeee0822a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.528 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-vif-unplugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.528 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.528 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.529 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.529 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] No waiting events found dispatching network-vif-unplugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.529 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-vif-unplugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.529 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.529 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.530 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.530 225317 DEBUG oslo_concurrency.lockutils [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.530 225317 DEBUG nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] No waiting events found dispatching network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.530 225317 WARNING nova.compute.manager [req-6cdcfc31-7ec8-413f-8ddf-2d973afdf857 req-54c052c4-8fea-4ba2-aaf4-5c54ffcad3b8 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Received unexpected event network-vif-plugged-9b2f10a1-c537-4e0f-80d7-2208da62b14e for instance with vm_state active and task_state deleting.#033[00m
Jan 22 04:59:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:59:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2784702328' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.634 225317 DEBUG oslo_concurrency.processutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.637 225317 DEBUG nova.compute.provider_tree [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.654 225317 DEBUG nova.scheduler.client.report [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.673 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:38 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.692 225317 INFO nova.scheduler.client.report [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Deleted allocations for instance 615a2021-5fec-4c87-b900-a7adeee0822a#033[00m
Jan 22 04:59:38 np0005591762 nova_compute[225313]: 2026-01-22 09:59:38.736 225317 DEBUG oslo_concurrency.lockutils [None req-6b150aeb-4870-45ef-b018-68cd8b1c9e0e 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "615a2021-5fec-4c87-b900-a7adeee0822a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:39.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:39 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:40.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:40 np0005591762 nova_compute[225313]: 2026-01-22 09:59:40.301 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:40 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:41.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:41 np0005591762 nova_compute[225313]: 2026-01-22 09:59:41.512 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:41 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:41 np0005591762 podman[234038]: 2026-01-22 09:59:41.820683446 +0000 UTC m=+0.044067219 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 04:59:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:42.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:42 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:43.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:43 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:44.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:44 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:45.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:45 np0005591762 nova_compute[225313]: 2026-01-22 09:59:45.302 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:45 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:45 np0005591762 nova_compute[225313]: 2026-01-22 09:59:45.767 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:45 np0005591762 nova_compute[225313]: 2026-01-22 09:59:45.852 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:46.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:46 np0005591762 nova_compute[225313]: 2026-01-22 09:59:46.513 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:46 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:47.202 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:47.203 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 09:59:47.203 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:47.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:47 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:48.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:48 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:49 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:49 np0005591762 podman[234063]: 2026-01-22 09:59:49.825317997 +0000 UTC m=+0.049134823 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 04:59:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:59:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:59:50 np0005591762 nova_compute[225313]: 2026-01-22 09:59:50.305 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:50 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:51 np0005591762 nova_compute[225313]: 2026-01-22 09:59:51.495 225317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769075976.493941, 615a2021-5fec-4c87-b900-a7adeee0822a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 04:59:51 np0005591762 nova_compute[225313]: 2026-01-22 09:59:51.496 225317 INFO nova.compute.manager [-] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] VM Stopped (Lifecycle Event)#033[00m
Jan 22 04:59:51 np0005591762 nova_compute[225313]: 2026-01-22 09:59:51.510 225317 DEBUG nova.compute.manager [None req-0eb20e8b-fe1f-43e8-ad53-363299732a07 - - - - - -] [instance: 615a2021-5fec-4c87-b900-a7adeee0822a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 04:59:51 np0005591762 nova_compute[225313]: 2026-01-22 09:59:51.514 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:51 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:52 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 04:59:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:53.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 04:59:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:53 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:54.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.675 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.676 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:54 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.689 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.738 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.738 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.743 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.744 225317 INFO nova.compute.claims [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 04:59:54 np0005591762 nova_compute[225313]: 2026-01-22 09:59:54.817 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:55 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 04:59:55 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2180721812' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.159 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.163 225317 DEBUG nova.compute.provider_tree [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.175 225317 DEBUG nova.scheduler.client.report [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.188 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.188 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.218 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.218 225317 DEBUG nova.network.neutron [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.230 225317 INFO nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.239 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 04:59:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 04:59:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.306 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.309 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.310 225317 INFO nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Creating image(s)#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.326 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.341 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.356 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.358 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.369 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.404 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.404 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "9db187949728ea707722fd244d769f131efa8688" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.405 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.405 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "9db187949728ea707722fd244d769f131efa8688" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.420 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.422 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.545 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/9db187949728ea707722fd244d769f131efa8688 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.590 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] resizing rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.643 225317 DEBUG nova.objects.instance [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'migration_context' on Instance uuid 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.660 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.661 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Ensure instance console log exists: /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.661 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.661 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 04:59:55 np0005591762 nova_compute[225313]: 2026-01-22 09:59:55.662 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 04:59:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:55 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:56 np0005591762 nova_compute[225313]: 2026-01-22 09:59:56.172 225317 DEBUG nova.policy [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4428dd9b0fb64c25b8f33b0050d4ef6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 04:59:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:56.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:56 np0005591762 nova_compute[225313]: 2026-01-22 09:59:56.514 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 04:59:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:56 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 04:59:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:57.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:57 np0005591762 nova_compute[225313]: 2026-01-22 09:59:57.368 225317 DEBUG nova.network.neutron [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Successfully created port: e3bebfc5-8852-49d3-8717-292cdf478ad3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 04:59:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:57 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 04:59:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:09:59:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 04:59:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:58 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 04:59:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000021s ======
Jan 22 04:59:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:09:59:59.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000021s
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.331 225317 DEBUG nova.network.neutron [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Successfully updated port: e3bebfc5-8852-49d3-8717-292cdf478ad3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.344 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.344 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquired lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.344 225317 DEBUG nova.network.neutron [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.406 225317 DEBUG nova.compute.manager [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-changed-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.407 225317 DEBUG nova.compute.manager [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Refreshing instance network info cache due to event network-changed-e3bebfc5-8852-49d3-8717-292cdf478ad3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.407 225317 DEBUG oslo_concurrency.lockutils [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 04:59:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 09:59:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 09:59:59 2026: (VI_0) received an invalid passwd!
Jan 22 04:59:59 np0005591762 nova_compute[225313]: 2026-01-22 09:59:59.798 225317 DEBUG nova.network.neutron [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 05:00:00 np0005591762 ceph-mon[75519]: overall HEALTH_WARN 1 failed cephadm daemon(s)
Jan 22 05:00:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:00.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.312 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.389 225317 DEBUG nova.network.neutron [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updating instance_info_cache with network_info: [{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.416 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Releasing lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.417 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Instance network_info: |[{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.417 225317 DEBUG oslo_concurrency.lockutils [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.417 225317 DEBUG nova.network.neutron [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Refreshing network info cache for port e3bebfc5-8852-49d3-8717-292cdf478ad3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.419 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Start _get_guest_xml network_info=[{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'encryption_options': None, 'image_id': 'bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.423 225317 WARNING nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.426 225317 DEBUG nova.virt.libvirt.host [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.427 225317 DEBUG nova.virt.libvirt.host [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.430 225317 DEBUG nova.virt.libvirt.host [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.431 225317 DEBUG nova.virt.libvirt.host [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.431 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.431 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T09:51:32Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6eff66ba-fb3e-4ca7-b05b-920b01d9affd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T09:51:33Z,direct_url=<?>,disk_format='qcow2',id=bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a894ac5b4f744f208fa506d5e8f67970',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T09:51:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.432 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.432 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.432 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.432 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.432 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.432 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.433 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.433 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.433 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.433 225317 DEBUG nova.virt.hardware [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.435 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:00 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 05:00:00 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3800525335' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.773 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.791 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 05:00:00 np0005591762 nova_compute[225313]: 2026-01-22 10:00:00.794 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 22 05:00:01 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1293001222' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.132 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.134 225317 DEBUG nova.virt.libvirt.vif [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:59:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1485096491',display_name='tempest-TestNetworkBasicOps-server-1485096491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1485096491',id=13,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL2pc0TuqUMszQQlwUNPe2IiKPxg1vDES7F+/o5TcvWmUC/ZotaQeC3RN7xlK2WuTBNfjKP2Rvz+n/ptTG376owwHzIG88GabnI7mkpNflPSkppz3Jmj4S18P5x/C9DMEg==',key_name='tempest-TestNetworkBasicOps-1154676976',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-l4f0o2qq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:59:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=1ad2d5a4-5651-4a6e-a0b5-091e04e46df0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.134 225317 DEBUG nova.network.os_vif_util [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.135 225317 DEBUG nova.network.os_vif_util [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.136 225317 DEBUG nova.objects.instance [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.152 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <uuid>1ad2d5a4-5651-4a6e-a0b5-091e04e46df0</uuid>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <name>instance-0000000d</name>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <memory>131072</memory>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <vcpu>1</vcpu>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <metadata>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:name>tempest-TestNetworkBasicOps-server-1485096491</nova:name>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:creationTime>2026-01-22 10:00:00</nova:creationTime>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:flavor name="m1.nano">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:memory>128</nova:memory>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:disk>1</nova:disk>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:swap>0</nova:swap>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:vcpus>1</nova:vcpus>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </nova:flavor>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:owner>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:user uuid="4428dd9b0fb64c25b8f33b0050d4ef6f">tempest-TestNetworkBasicOps-349110285-project-member</nova:user>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:project uuid="05af97dae0f4449ba7eb640bcd3f61e6">tempest-TestNetworkBasicOps-349110285</nova:project>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </nova:owner>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:root type="image" uuid="bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <nova:ports>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <nova:port uuid="e3bebfc5-8852-49d3-8717-292cdf478ad3">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        </nova:port>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </nova:ports>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </nova:instance>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </metadata>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <sysinfo type="smbios">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <system>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <entry name="manufacturer">RDO</entry>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <entry name="product">OpenStack Compute</entry>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <entry name="serial">1ad2d5a4-5651-4a6e-a0b5-091e04e46df0</entry>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <entry name="uuid">1ad2d5a4-5651-4a6e-a0b5-091e04e46df0</entry>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <entry name="family">Virtual Machine</entry>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </system>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </sysinfo>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <os>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <boot dev="hd"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <smbios mode="sysinfo"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </os>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <features>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <acpi/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <apic/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <vmcoreinfo/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </features>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <clock offset="utc">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <timer name="hpet" present="no"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </clock>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <cpu mode="host-model" match="exact">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </cpu>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  <devices>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <disk type="network" device="disk">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </source>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </auth>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <target dev="vda" bus="virtio"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </disk>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <disk type="network" device="cdrom">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <driver type="raw" cache="none"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <source protocol="rbd" name="vms/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk.config">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <host name="192.168.122.100" port="6789"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <host name="192.168.122.102" port="6789"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <host name="192.168.122.101" port="6789"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </source>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <auth username="openstack">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:        <secret type="ceph" uuid="43df7a30-cf5f-5209-adfd-bf44298b19f2"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      </auth>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <target dev="sda" bus="sata"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </disk>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <interface type="ethernet">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <mac address="fa:16:3e:7f:a3:f4"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <mtu size="1442"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <target dev="tape3bebfc5-88"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </interface>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <serial type="pty">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <log file="/var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/console.log" append="off"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </serial>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <video>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <model type="virtio"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </video>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <input type="tablet" bus="usb"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <rng model="virtio">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <backend model="random">/dev/urandom</backend>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </rng>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <controller type="usb" index="0"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    <memballoon model="virtio">
Jan 22 05:00:01 np0005591762 nova_compute[225313]:      <stats period="10"/>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:    </memballoon>
Jan 22 05:00:01 np0005591762 nova_compute[225313]:  </devices>
Jan 22 05:00:01 np0005591762 nova_compute[225313]: </domain>
Jan 22 05:00:01 np0005591762 nova_compute[225313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.153 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Preparing to wait for external event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.154 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.154 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.154 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.155 225317 DEBUG nova.virt.libvirt.vif [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T09:59:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1485096491',display_name='tempest-TestNetworkBasicOps-server-1485096491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1485096491',id=13,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL2pc0TuqUMszQQlwUNPe2IiKPxg1vDES7F+/o5TcvWmUC/ZotaQeC3RN7xlK2WuTBNfjKP2Rvz+n/ptTG376owwHzIG88GabnI7mkpNflPSkppz3Jmj4S18P5x/C9DMEg==',key_name='tempest-TestNetworkBasicOps-1154676976',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-l4f0o2qq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T09:59:55Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=1ad2d5a4-5651-4a6e-a0b5-091e04e46df0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.155 225317 DEBUG nova.network.os_vif_util [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.155 225317 DEBUG nova.network.os_vif_util [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.156 225317 DEBUG os_vif [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.156 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.156 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.157 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.159 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.159 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bebfc5-88, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.160 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape3bebfc5-88, col_values=(('external_ids', {'iface-id': 'e3bebfc5-8852-49d3-8717-292cdf478ad3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:a3:f4', 'vm-uuid': '1ad2d5a4-5651-4a6e-a0b5-091e04e46df0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:01 np0005591762 NetworkManager[48910]: <info>  [1769076001.1616] manager: (tape3bebfc5-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.163 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.165 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.166 225317 INFO os_vif [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88')#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.209 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.209 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.209 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] No VIF found with MAC fa:16:3e:7f:a3:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.210 225317 INFO nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Using config drive#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.224 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 05:00:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:01.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.510 225317 INFO nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Creating config drive at /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/disk.config#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.514 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp36mnl1_5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.632 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp36mnl1_5" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.651 225317 DEBUG nova.storage.rbd_utils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] rbd image 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.653 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/disk.config 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.732 225317 DEBUG oslo_concurrency.processutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/disk.config 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.733 225317 INFO nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Deleting local config drive /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0/disk.config because it was imported into RBD.#033[00m
Jan 22 05:00:01 np0005591762 kernel: tape3bebfc5-88: entered promiscuous mode
Jan 22 05:00:01 np0005591762 NetworkManager[48910]: <info>  [1769076001.7677] manager: (tape3bebfc5-88): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.770 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:01Z|00084|binding|INFO|Claiming lport e3bebfc5-8852-49d3-8717-292cdf478ad3 for this chassis.
Jan 22 05:00:01 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:01Z|00085|binding|INFO|e3bebfc5-8852-49d3-8717-292cdf478ad3: Claiming fa:16:3e:7f:a3:f4 10.100.0.11
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.773 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.776 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.780 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:a3:f4 10.100.0.11'], port_security=['fa:16:3e:7f:a3:f4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1ad2d5a4-5651-4a6e-a0b5-091e04e46df0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80f62cc8-67c6-4829-bc6a-ce04fcaad346', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9e6f05d-1713-46dc-a43a-c893cf31eb9d, chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=e3bebfc5-8852-49d3-8717-292cdf478ad3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.781 143150 INFO neutron.agent.ovn.metadata.agent [-] Port e3bebfc5-8852-49d3-8717-292cdf478ad3 in datapath 69665f9f-c85f-4b95-9e4f-1741fbfc93c0 bound to our chassis#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.782 143150 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69665f9f-c85f-4b95-9e4f-1741fbfc93c0#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.791 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[28c0a2d7-7258-443c-a2d6-912026d92963]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.792 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap69665f9f-c1 in ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 05:00:01 np0005591762 systemd-udevd[234447]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.794 228218 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap69665f9f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.794 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[15e83e00-4d14-446c-bc20-46e0d77fee0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 systemd-machined[193990]: New machine qemu-6-instance-0000000d.
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.797 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[df0848a8-e646-4da4-acd1-7aa3495e485c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 NetworkManager[48910]: <info>  [1769076001.8052] device (tape3bebfc5-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 05:00:01 np0005591762 systemd[1]: Started Virtual Machine qemu-6-instance-0000000d.
Jan 22 05:00:01 np0005591762 NetworkManager[48910]: <info>  [1769076001.8095] device (tape3bebfc5-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.809 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[5a39b256-e799-4fd7-9b4e-b775136da713]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.832 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d38130-305d-477c-941f-188d35dbe894]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.845 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:01Z|00086|binding|INFO|Setting lport e3bebfc5-8852-49d3-8717-292cdf478ad3 ovn-installed in OVS
Jan 22 05:00:01 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:01Z|00087|binding|INFO|Setting lport e3bebfc5-8852-49d3-8717-292cdf478ad3 up in Southbound
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.849 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.855 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[a72caea3-497c-4746-8530-fb794523e231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.859 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[cadbc226-8cee-471d-94a1-c66f4d177ec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 NetworkManager[48910]: <info>  [1769076001.8599] manager: (tap69665f9f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.884 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[23b07990-68df-4e3b-87f8-2a56f24cccd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.886 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[023b2aca-6aca-41b2-b06c-4b800d6e7739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 NetworkManager[48910]: <info>  [1769076001.9026] device (tap69665f9f-c0): carrier: link connected
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.906 228264 DEBUG oslo.privsep.daemon [-] privsep: reply[640b5911-2dfd-4ba9-9e2f-ade6e01eadde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.921 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[24dbdfb0-cdf5-4b91-97bf-44bd206fdb39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69665f9f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:1b:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357945, 'reachable_time': 32410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234471, 'error': None, 'target': 'ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.933 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[de5ed4cb-5201-44e6-845e-8dc90e42487b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:1b31'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357945, 'tstamp': 357945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234472, 'error': None, 'target': 'ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.945 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[5d79e8b7-dfe3-41fd-b61d-dab4531b372b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69665f9f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:1b:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357945, 'reachable_time': 32410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234473, 'error': None, 'target': 'ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.949 225317 DEBUG nova.network.neutron [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updated VIF entry in instance network info cache for port e3bebfc5-8852-49d3-8717-292cdf478ad3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.950 225317 DEBUG nova.network.neutron [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updating instance_info_cache with network_info: [{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.963 225317 DEBUG oslo_concurrency.lockutils [req-28038fab-13a7-4ead-b6dc-a4b24b6d7e14 req-c3c788d8-0056-452e-9a30-b94dfd48e283 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 05:00:01 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:01.966 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b8c4ea-3b7c-4a2b-91f0-2f996f65b84e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.986 225317 DEBUG nova.compute.manager [req-fa845091-8331-404b-9221-e449ee92ad92 req-d27acc02-18ef-497c-b4e8-fb68b1c707ae e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.987 225317 DEBUG oslo_concurrency.lockutils [req-fa845091-8331-404b-9221-e449ee92ad92 req-d27acc02-18ef-497c-b4e8-fb68b1c707ae e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.987 225317 DEBUG oslo_concurrency.lockutils [req-fa845091-8331-404b-9221-e449ee92ad92 req-d27acc02-18ef-497c-b4e8-fb68b1c707ae e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.987 225317 DEBUG oslo_concurrency.lockutils [req-fa845091-8331-404b-9221-e449ee92ad92 req-d27acc02-18ef-497c-b4e8-fb68b1c707ae e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:01 np0005591762 nova_compute[225313]: 2026-01-22 10:00:01.988 225317 DEBUG nova.compute.manager [req-fa845091-8331-404b-9221-e449ee92ad92 req-d27acc02-18ef-497c-b4e8-fb68b1c707ae e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Processing event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.010 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[49923385-204a-4de6-8a4b-05d08bdbaa64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.011 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69665f9f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.011 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.012 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69665f9f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.013 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:02 np0005591762 NetworkManager[48910]: <info>  [1769076002.0135] manager: (tap69665f9f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 22 05:00:02 np0005591762 kernel: tap69665f9f-c0: entered promiscuous mode
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.016 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69665f9f-c0, col_values=(('external_ids', {'iface-id': 'f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.016 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:02 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:02Z|00088|binding|INFO|Releasing lport f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45 from this chassis (sb_readonly=0)
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.030 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.030 143150 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69665f9f-c85f-4b95-9e4f-1741fbfc93c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69665f9f-c85f-4b95-9e4f-1741fbfc93c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.031 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[3875684f-2646-4925-9e01-4692bb44f290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.032 143150 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: global
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    log         /dev/log local0 debug
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    log-tag     haproxy-metadata-proxy-69665f9f-c85f-4b95-9e4f-1741fbfc93c0
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    user        root
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    group       root
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    maxconn     1024
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    pidfile     /var/lib/neutron/external/pids/69665f9f-c85f-4b95-9e4f-1741fbfc93c0.pid.haproxy
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    daemon
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: defaults
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    log global
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    mode http
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    option httplog
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    option dontlognull
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    option http-server-close
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    option forwardfor
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    retries                 3
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    timeout http-request    30s
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    timeout connect         30s
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    timeout client          32s
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    timeout server          32s
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    timeout http-keep-alive 30s
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: listen listener
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    bind 169.254.169.254:80
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]:    http-request add-header X-OVN-Network-ID 69665f9f-c85f-4b95-9e4f-1741fbfc93c0
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 05:00:02 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:02.033 143150 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'env', 'PROCESS_TAG=haproxy-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/69665f9f-c85f-4b95-9e4f-1741fbfc93c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 05:00:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:02.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:02 np0005591762 podman[234537]: 2026-01-22 10:00:02.315157187 +0000 UTC m=+0.035239211 container create 706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 05:00:02 np0005591762 systemd[1]: Started libpod-conmon-706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a.scope.
Jan 22 05:00:02 np0005591762 systemd[1]: Started libcrun container.
Jan 22 05:00:02 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/779a685b4e89fb161f174ba4563447e76a04d306069a981ea30885be5ba0b90b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 05:00:02 np0005591762 podman[234537]: 2026-01-22 10:00:02.361113746 +0000 UTC m=+0.081195782 container init 706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.363 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769076002.3632267, 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.364 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] VM Started (Lifecycle Event)#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.365 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.368 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.371 225317 INFO nova.virt.libvirt.driver [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Instance spawned successfully.#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.371 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 05:00:02 np0005591762 podman[234537]: 2026-01-22 10:00:02.369341121 +0000 UTC m=+0.089423146 container start 706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 05:00:02 np0005591762 podman[234537]: 2026-01-22 10:00:02.301071516 +0000 UTC m=+0.021153561 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.385 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.396 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 05:00:02 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [NOTICE]   (234559) : New worker (234561) forked
Jan 22 05:00:02 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [NOTICE]   (234559) : Loading success.
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.398 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.399 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.399 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.399 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.400 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.400 225317 DEBUG nova.virt.libvirt.driver [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.417 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.417 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769076002.3633962, 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.418 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] VM Paused (Lifecycle Event)#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.435 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.437 225317 DEBUG nova.virt.driver [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] Emitting event <LifecycleEvent: 1769076002.3675961, 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.437 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] VM Resumed (Lifecycle Event)#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.451 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.452 225317 INFO nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Took 7.15 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.452 225317 DEBUG nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.454 225317 DEBUG nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.472 225317 INFO nova.compute.manager [None req-8796b474-79ce-40df-b660-1f72128ac972 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.497 225317 INFO nova.compute.manager [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Took 7.78 seconds to build instance.#033[00m
Jan 22 05:00:02 np0005591762 nova_compute[225313]: 2026-01-22 10:00:02.506 225317 DEBUG oslo_concurrency.lockutils [None req-4a28a99c-b157-4acb-ac9f-2887ac6d269f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:03.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:04 np0005591762 nova_compute[225313]: 2026-01-22 10:00:04.049 225317 DEBUG nova.compute.manager [req-3afd3176-dacc-41eb-9cdb-d02dec68332c req-275805c2-ecf1-4326-b933-2c1cbde6a1c1 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:04 np0005591762 nova_compute[225313]: 2026-01-22 10:00:04.049 225317 DEBUG oslo_concurrency.lockutils [req-3afd3176-dacc-41eb-9cdb-d02dec68332c req-275805c2-ecf1-4326-b933-2c1cbde6a1c1 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:04 np0005591762 nova_compute[225313]: 2026-01-22 10:00:04.050 225317 DEBUG oslo_concurrency.lockutils [req-3afd3176-dacc-41eb-9cdb-d02dec68332c req-275805c2-ecf1-4326-b933-2c1cbde6a1c1 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:04 np0005591762 nova_compute[225313]: 2026-01-22 10:00:04.050 225317 DEBUG oslo_concurrency.lockutils [req-3afd3176-dacc-41eb-9cdb-d02dec68332c req-275805c2-ecf1-4326-b933-2c1cbde6a1c1 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:04 np0005591762 nova_compute[225313]: 2026-01-22 10:00:04.050 225317 DEBUG nova.compute.manager [req-3afd3176-dacc-41eb-9cdb-d02dec68332c req-275805c2-ecf1-4326-b933-2c1cbde6a1c1 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] No waiting events found dispatching network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 05:00:04 np0005591762 nova_compute[225313]: 2026-01-22 10:00:04.050 225317 WARNING nova.compute.manager [req-3afd3176-dacc-41eb-9cdb-d02dec68332c req-275805c2-ecf1-4326-b933-2c1cbde6a1c1 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received unexpected event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 for instance with vm_state active and task_state None.#033[00m
Jan 22 05:00:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:04.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:05.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:05 np0005591762 nova_compute[225313]: 2026-01-22 10:00:05.312 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:06 np0005591762 nova_compute[225313]: 2026-01-22 10:00:06.162 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:06.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:07.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:08 np0005591762 nova_compute[225313]: 2026-01-22 10:00:08.245 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:08 np0005591762 NetworkManager[48910]: <info>  [1769076008.2472] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 22 05:00:08 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:08Z|00089|binding|INFO|Releasing lport f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45 from this chassis (sb_readonly=0)
Jan 22 05:00:08 np0005591762 NetworkManager[48910]: <info>  [1769076008.2478] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 22 05:00:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:08 np0005591762 nova_compute[225313]: 2026-01-22 10:00:08.283 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:08 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:08Z|00090|binding|INFO|Releasing lport f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45 from this chassis (sb_readonly=0)
Jan 22 05:00:08 np0005591762 nova_compute[225313]: 2026-01-22 10:00:08.286 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:09.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:09 np0005591762 nova_compute[225313]: 2026-01-22 10:00:09.360 225317 DEBUG nova.compute.manager [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-changed-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:09 np0005591762 nova_compute[225313]: 2026-01-22 10:00:09.360 225317 DEBUG nova.compute.manager [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Refreshing instance network info cache due to event network-changed-e3bebfc5-8852-49d3-8717-292cdf478ad3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 05:00:09 np0005591762 nova_compute[225313]: 2026-01-22 10:00:09.360 225317 DEBUG oslo_concurrency.lockutils [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 05:00:09 np0005591762 nova_compute[225313]: 2026-01-22 10:00:09.360 225317 DEBUG oslo_concurrency.lockutils [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 05:00:09 np0005591762 nova_compute[225313]: 2026-01-22 10:00:09.360 225317 DEBUG nova.network.neutron [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Refreshing network info cache for port e3bebfc5-8852-49d3-8717-292cdf478ad3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 05:00:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:10 np0005591762 nova_compute[225313]: 2026-01-22 10:00:10.224 225317 DEBUG nova.network.neutron [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updated VIF entry in instance network info cache for port e3bebfc5-8852-49d3-8717-292cdf478ad3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 05:00:10 np0005591762 nova_compute[225313]: 2026-01-22 10:00:10.224 225317 DEBUG nova.network.neutron [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updating instance_info_cache with network_info: [{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 05:00:10 np0005591762 nova_compute[225313]: 2026-01-22 10:00:10.239 225317 DEBUG oslo_concurrency.lockutils [req-f8ba5601-0ac0-452c-b0bf-3c77abfb594a req-952e8100-2d21-4c9b-a145-2bccedc48e9f e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 05:00:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:10 np0005591762 nova_compute[225313]: 2026-01-22 10:00:10.314 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:11 np0005591762 nova_compute[225313]: 2026-01-22 10:00:11.163 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:11.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:11 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 22 05:00:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:12.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:12 np0005591762 nova_compute[225313]: 2026-01-22 10:00:12.784 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:12 np0005591762 podman[234579]: 2026-01-22 10:00:12.833799284 +0000 UTC m=+0.056253217 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 05:00:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:13.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:13 np0005591762 nova_compute[225313]: 2026-01-22 10:00:13.732 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:14.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:14 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:14Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:a3:f4 10.100.0.11
Jan 22 05:00:14 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:14Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:a3:f4 10.100.0.11
Jan 22 05:00:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:14 np0005591762 nova_compute[225313]: 2026-01-22 10:00:14.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:15.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:15 np0005591762 nova_compute[225313]: 2026-01-22 10:00:15.315 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:16 np0005591762 nova_compute[225313]: 2026-01-22 10:00:16.166 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:16.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:16 np0005591762 nova_compute[225313]: 2026-01-22 10:00:16.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:16 np0005591762 nova_compute[225313]: 2026-01-22 10:00:16.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:00:16 np0005591762 nova_compute[225313]: 2026-01-22 10:00:16.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:00:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.030 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.030 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquired lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.030 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.030 225317 DEBUG nova.objects.instance [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 05:00:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:17.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.737 225317 DEBUG nova.network.neutron [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updating instance_info_cache with network_info: [{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.749 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Releasing lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.749 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.749 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.750 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.750 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:17 np0005591762 nova_compute[225313]: 2026-01-22 10:00:17.750 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:00:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:18.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:19.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:19 np0005591762 nova_compute[225313]: 2026-01-22 10:00:19.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:20.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:20 np0005591762 nova_compute[225313]: 2026-01-22 10:00:20.317 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:20 np0005591762 nova_compute[225313]: 2026-01-22 10:00:20.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:20 np0005591762 podman[234628]: 2026-01-22 10:00:20.836731413 +0000 UTC m=+0.058069453 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.167 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:21.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.588 225317 INFO nova.compute.manager [None req-44bd0a8e-8f72-43c1-b83a-6c649d0f8c15 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Get console output#033[00m
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.592 230487 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 05:00:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.739 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:00:21 np0005591762 nova_compute[225313]: 2026-01-22 10:00:21.739 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:00:22 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3489707096' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.080 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.125 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.126 225317 DEBUG nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 22 05:00:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:22.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.336 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.337 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4650MB free_disk=59.94289016723633GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.337 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.337 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.385 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Instance 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.385 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.385 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:00:22 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:22Z|00091|binding|INFO|Releasing lport f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45 from this chassis (sb_readonly=0)
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.424 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.429 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:22 np0005591762 ceph-osd[77912]: bluestore.MempoolThread fragmentation_score=0.000363 took=0.000023s
Jan 22 05:00:22 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:22Z|00092|binding|INFO|Releasing lport f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45 from this chassis (sb_readonly=0)
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.485 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:00:22 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1561147054' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.766 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.770 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.782 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.798 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:00:22 np0005591762 nova_compute[225313]: 2026-01-22 10:00:22.798 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:23.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:23 np0005591762 nova_compute[225313]: 2026-01-22 10:00:23.587 225317 INFO nova.compute.manager [None req-13c2dbea-5797-4995-835e-93dd06decad7 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Get console output#033[00m
Jan 22 05:00:23 np0005591762 nova_compute[225313]: 2026-01-22 10:00:23.591 230487 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 05:00:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:24.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:25 np0005591762 NetworkManager[48910]: <info>  [1769076025.2158] manager: (patch-br-int-to-provnet-397c94eb-88af-4737-bae3-7adb982d097b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 22 05:00:25 np0005591762 NetworkManager[48910]: <info>  [1769076025.2163] manager: (patch-provnet-397c94eb-88af-4737-bae3-7adb982d097b-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 22 05:00:25 np0005591762 nova_compute[225313]: 2026-01-22 10:00:25.216 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:25 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:25Z|00093|binding|INFO|Releasing lport f1da1ad4-d8a7-4bf3-9975-cd59aaa90c45 from this chassis (sb_readonly=0)
Jan 22 05:00:25 np0005591762 nova_compute[225313]: 2026-01-22 10:00:25.275 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:25 np0005591762 nova_compute[225313]: 2026-01-22 10:00:25.280 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:25 np0005591762 nova_compute[225313]: 2026-01-22 10:00:25.319 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:25 np0005591762 nova_compute[225313]: 2026-01-22 10:00:25.496 225317 INFO nova.compute.manager [None req-994bb854-5ce4-45bb-90aa-49cddcfa1a5f 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Get console output#033[00m
Jan 22 05:00:25 np0005591762 nova_compute[225313]: 2026-01-22 10:00:25.500 230487 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 05:00:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.169 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:26.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.335 225317 DEBUG nova.compute.manager [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-changed-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.336 225317 DEBUG nova.compute.manager [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Refreshing instance network info cache due to event network-changed-e3bebfc5-8852-49d3-8717-292cdf478ad3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.336 225317 DEBUG oslo_concurrency.lockutils [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.336 225317 DEBUG oslo_concurrency.lockutils [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquired lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.336 225317 DEBUG nova.network.neutron [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Refreshing network info cache for port e3bebfc5-8852-49d3-8717-292cdf478ad3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.406 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.406 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.406 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.406 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.407 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.407 225317 INFO nova.compute.manager [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Terminating instance#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.408 225317 DEBUG nova.compute.manager [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 05:00:26 np0005591762 kernel: tape3bebfc5-88 (unregistering): left promiscuous mode
Jan 22 05:00:26 np0005591762 NetworkManager[48910]: <info>  [1769076026.4514] device (tape3bebfc5-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.456 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:26Z|00094|binding|INFO|Releasing lport e3bebfc5-8852-49d3-8717-292cdf478ad3 from this chassis (sb_readonly=0)
Jan 22 05:00:26 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:26Z|00095|binding|INFO|Setting lport e3bebfc5-8852-49d3-8717-292cdf478ad3 down in Southbound
Jan 22 05:00:26 np0005591762 ovn_controller[133622]: 2026-01-22T10:00:26Z|00096|binding|INFO|Removing iface tape3bebfc5-88 ovn-installed in OVS
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.459 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.461 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:a3:f4 10.100.0.11'], port_security=['fa:16:3e:7f:a3:f4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1ad2d5a4-5651-4a6e-a0b5-091e04e46df0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05af97dae0f4449ba7eb640bcd3f61e6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80f62cc8-67c6-4829-bc6a-ce04fcaad346', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9e6f05d-1713-46dc-a43a-c893cf31eb9d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>], logical_port=e3bebfc5-8852-49d3-8717-292cdf478ad3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe7b0d23820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.462 143150 INFO neutron.agent.ovn.metadata.agent [-] Port e3bebfc5-8852-49d3-8717-292cdf478ad3 in datapath 69665f9f-c85f-4b95-9e4f-1741fbfc93c0 unbound from our chassis#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.463 143150 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 69665f9f-c85f-4b95-9e4f-1741fbfc93c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.465 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ab6443-ac52-4260-b693-0e877bf0c68f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.466 143150 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0 namespace which is not needed anymore#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.476 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 22 05:00:26 np0005591762 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Consumed 11.028s CPU time.
Jan 22 05:00:26 np0005591762 systemd-machined[193990]: Machine qemu-6-instance-0000000d terminated.
Jan 22 05:00:26 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [NOTICE]   (234559) : haproxy version is 2.8.14-c23fe91
Jan 22 05:00:26 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [NOTICE]   (234559) : path to executable is /usr/sbin/haproxy
Jan 22 05:00:26 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [WARNING]  (234559) : Exiting Master process...
Jan 22 05:00:26 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [ALERT]    (234559) : Current worker (234561) exited with code 143 (Terminated)
Jan 22 05:00:26 np0005591762 neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0[234555]: [WARNING]  (234559) : All workers exited. Exiting... (0)
Jan 22 05:00:26 np0005591762 systemd[1]: libpod-706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a.scope: Deactivated successfully.
Jan 22 05:00:26 np0005591762 podman[234724]: 2026-01-22 10:00:26.566179221 +0000 UTC m=+0.036521038 container died 706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 05:00:26 np0005591762 podman[234724]: 2026-01-22 10:00:26.584663538 +0000 UTC m=+0.055005353 container cleanup 706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 05:00:26 np0005591762 systemd[1]: var-lib-containers-storage-overlay-779a685b4e89fb161f174ba4563447e76a04d306069a981ea30885be5ba0b90b-merged.mount: Deactivated successfully.
Jan 22 05:00:26 np0005591762 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a-userdata-shm.mount: Deactivated successfully.
Jan 22 05:00:26 np0005591762 systemd[1]: libpod-conmon-706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a.scope: Deactivated successfully.
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.622 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.628 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 podman[234748]: 2026-01-22 10:00:26.630289855 +0000 UTC m=+0.029268401 container remove 706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.631 225317 INFO nova.virt.libvirt.driver [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Instance destroyed successfully.#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.631 225317 DEBUG nova.objects.instance [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lazy-loading 'resources' on Instance uuid 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.636 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[48e10480-afe5-4912-9dc8-606925c2a5fc]: (4, ('Thu Jan 22 10:00:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0 (706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a)\n706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a\nThu Jan 22 10:00:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0 (706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a)\n706ec9e64e63d5b2fb85acb9f24752d8f864b1963259832686b0cdfef52a063a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.637 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[df248814-f8fd-4963-90fc-f4ff36e4b548]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.638 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69665f9f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.640 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 kernel: tap69665f9f-c0: left promiscuous mode
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.646 225317 DEBUG nova.virt.libvirt.vif [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T09:59:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1485096491',display_name='tempest-TestNetworkBasicOps-server-1485096491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1485096491',id=13,image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL2pc0TuqUMszQQlwUNPe2IiKPxg1vDES7F+/o5TcvWmUC/ZotaQeC3RN7xlK2WuTBNfjKP2Rvz+n/ptTG376owwHzIG88GabnI7mkpNflPSkppz3Jmj4S18P5x/C9DMEg==',key_name='tempest-TestNetworkBasicOps-1154676976',keypairs=<?>,launch_index=0,launched_at=2026-01-22T10:00:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05af97dae0f4449ba7eb640bcd3f61e6',ramdisk_id='',reservation_id='r-l4f0o2qq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bb9741cf-1bcc-4b9c-affa-dda3b9a7c93d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-349110285',owner_user_name='tempest-TestNetworkBasicOps-349110285-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T10:00:02Z,user_data=None,user_id='4428dd9b0fb64c25b8f33b0050d4ef6f',uuid=1ad2d5a4-5651-4a6e-a0b5-091e04e46df0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.646 225317 DEBUG nova.network.os_vif_util [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converting VIF {"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.647 225317 DEBUG nova.network.os_vif_util [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.647 225317 DEBUG os_vif [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.648 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.648 225317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bebfc5-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.649 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.651 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.657 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.659 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.660 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[76b6edf3-f421-4e9e-b92f-61931a8c5cf3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.661 225317 INFO os_vif [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:a3:f4,bridge_name='br-int',has_traffic_filtering=True,id=e3bebfc5-8852-49d3-8717-292cdf478ad3,network=Network(69665f9f-c85f-4b95-9e4f-1741fbfc93c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3bebfc5-88')#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.667 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[8323209d-09e1-4f03-b7ca-e2d459101fa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.669 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[f94e09ac-49ff-4d8b-9469-d35e2084a333]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.683 228218 DEBUG oslo.privsep.daemon [-] privsep: reply[6f024dc7-e070-4aa6-8313-459934df34e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357940, 'reachable_time': 44638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234783, 'error': None, 'target': 'ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 systemd[1]: run-netns-ovnmeta\x2d69665f9f\x2dc85f\x2d4b95\x2d9e4f\x2d1741fbfc93c0.mount: Deactivated successfully.
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.686 143537 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-69665f9f-c85f-4b95-9e4f-1741fbfc93c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 05:00:26 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:26.686 143537 DEBUG oslo.privsep.daemon [-] privsep: reply[7b79bb0d-3906-4157-be4c-f423e294b7f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 05:00:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.817 225317 INFO nova.virt.libvirt.driver [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Deleting instance files /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_del#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.818 225317 INFO nova.virt.libvirt.driver [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Deletion of /var/lib/nova/instances/1ad2d5a4-5651-4a6e-a0b5-091e04e46df0_del complete#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.859 225317 INFO nova.compute.manager [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.859 225317 DEBUG oslo.service.loopingcall [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.859 225317 DEBUG nova.compute.manager [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 05:00:26 np0005591762 nova_compute[225313]: 2026-01-22 10:00:26.859 225317 DEBUG nova.network.neutron [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 05:00:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:27.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.333 225317 DEBUG nova.network.neutron [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.350 225317 INFO nova.compute.manager [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Took 0.49 seconds to deallocate network for instance.#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.388 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.388 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.427 225317 DEBUG nova.network.neutron [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updated VIF entry in instance network info cache for port e3bebfc5-8852-49d3-8717-292cdf478ad3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.428 225317 DEBUG nova.network.neutron [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Updating instance_info_cache with network_info: [{"id": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "address": "fa:16:3e:7f:a3:f4", "network": {"id": "69665f9f-c85f-4b95-9e4f-1741fbfc93c0", "bridge": "br-int", "label": "tempest-network-smoke--1321975209", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05af97dae0f4449ba7eb640bcd3f61e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3bebfc5-88", "ovs_interfaceid": "e3bebfc5-8852-49d3-8717-292cdf478ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.433 225317 DEBUG oslo_concurrency.processutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.446 225317 DEBUG oslo_concurrency.lockutils [req-fad80584-0ac5-4daf-9d8d-ddeecf0a1234 req-ed2ce7e5-dd7d-4522-9de5-111473add9f6 e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Releasing lock "refresh_cache-1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 05:00:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:00:27 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3457799441' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.776 225317 DEBUG oslo_concurrency.processutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.343s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.779 225317 DEBUG nova.compute.provider_tree [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.794 225317 DEBUG nova.scheduler.client.report [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.807 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.828 225317 INFO nova.scheduler.client.report [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Deleted allocations for instance 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0#033[00m
Jan 22 05:00:27 np0005591762 nova_compute[225313]: 2026-01-22 10:00:27.880 225317 DEBUG oslo_concurrency.lockutils [None req-5259c783-b6a2-44cf-8f81-a6e42e7203ef 4428dd9b0fb64c25b8f33b0050d4ef6f 05af97dae0f4449ba7eb640bcd3f61e6 - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:28.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.418 225317 DEBUG nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-vif-unplugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.418 225317 DEBUG oslo_concurrency.lockutils [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.418 225317 DEBUG oslo_concurrency.lockutils [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.419 225317 DEBUG oslo_concurrency.lockutils [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.419 225317 DEBUG nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] No waiting events found dispatching network-vif-unplugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.419 225317 WARNING nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received unexpected event network-vif-unplugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.419 225317 DEBUG nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.419 225317 DEBUG oslo_concurrency.lockutils [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Acquiring lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.420 225317 DEBUG oslo_concurrency.lockutils [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.420 225317 DEBUG oslo_concurrency.lockutils [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] Lock "1ad2d5a4-5651-4a6e-a0b5-091e04e46df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.420 225317 DEBUG nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] No waiting events found dispatching network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.420 225317 WARNING nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received unexpected event network-vif-plugged-e3bebfc5-8852-49d3-8717-292cdf478ad3 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.420 225317 DEBUG nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Received event network-vif-deleted-e3bebfc5-8852-49d3-8717-292cdf478ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.421 225317 INFO nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Neutron deleted interface e3bebfc5-8852-49d3-8717-292cdf478ad3; detaching it from the instance and deleting it from the info cache#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.421 225317 DEBUG nova.network.neutron [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 22 05:00:28 np0005591762 nova_compute[225313]: 2026-01-22 10:00:28.422 225317 DEBUG nova.compute.manager [req-ce1fd4ae-3297-4808-8e8a-7e6ba940778f req-201370fb-967f-4a74-a4b9-4b5dc3f00deb e60ff740af6c4003b4590e5dcca11e4e 68e0da8184214c3cb30cd8a6d6c3704d - - default default] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Detach interface failed, port_id=e3bebfc5-8852-49d3-8717-292cdf478ad3, reason: Instance 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 22 05:00:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:29.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:30 np0005591762 nova_compute[225313]: 2026-01-22 10:00:30.151 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:30 np0005591762 nova_compute[225313]: 2026-01-22 10:00:30.231 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:30.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:30 np0005591762 nova_compute[225313]: 2026-01-22 10:00:30.320 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:31.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:31 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:00:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:00:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:00:31 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:00:31 np0005591762 nova_compute[225313]: 2026-01-22 10:00:31.648 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:32.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:33.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:34 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:34.184 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 05:00:34 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:34.185 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 05:00:34 np0005591762 nova_compute[225313]: 2026-01-22 10:00:34.186 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:34.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:34 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:00:34 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:00:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:35 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:35.186 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:00:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:35.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:35 np0005591762 nova_compute[225313]: 2026-01-22 10:00:35.321 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:36.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:36 np0005591762 nova_compute[225313]: 2026-01-22 10:00:36.648 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:37.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:38.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:39.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:40.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:40 np0005591762 nova_compute[225313]: 2026-01-22 10:00:40.323 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:41.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:41 np0005591762 nova_compute[225313]: 2026-01-22 10:00:41.627 225317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769076026.6270442, 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 05:00:41 np0005591762 nova_compute[225313]: 2026-01-22 10:00:41.627 225317 INFO nova.compute.manager [-] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] VM Stopped (Lifecycle Event)#033[00m
Jan 22 05:00:41 np0005591762 nova_compute[225313]: 2026-01-22 10:00:41.641 225317 DEBUG nova.compute.manager [None req-75027fcf-4994-4d7d-a001-e17be9403615 - - - - - -] [instance: 1ad2d5a4-5651-4a6e-a0b5-091e04e46df0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 05:00:41 np0005591762 nova_compute[225313]: 2026-01-22 10:00:41.650 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:42.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:43.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:43 np0005591762 podman[234960]: 2026-01-22 10:00:43.817450811 +0000 UTC m=+0.039586288 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 05:00:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:44.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:00:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:45.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:00:45 np0005591762 nova_compute[225313]: 2026-01-22 10:00:45.325 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:46.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:46 np0005591762 nova_compute[225313]: 2026-01-22 10:00:46.651 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:47.204 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:00:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:47.204 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:00:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:00:47.204 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:00:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:47.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:48.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:49.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:50.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:50 np0005591762 nova_compute[225313]: 2026-01-22 10:00:50.326 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:00:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/541053705' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:00:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:00:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/541053705' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:00:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:51.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:51 np0005591762 nova_compute[225313]: 2026-01-22 10:00:51.652 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:51 np0005591762 podman[234985]: 2026-01-22 10:00:51.828949656 +0000 UTC m=+0.052884486 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 22 05:00:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:52.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:53.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:54.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:55.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:55 np0005591762 nova_compute[225313]: 2026-01-22 10:00:55.326 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:00:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:56.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:00:56 np0005591762 nova_compute[225313]: 2026-01-22 10:00:56.653 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:00:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:00:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:57.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:00:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:00:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:00:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:00:59.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:00:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:00:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:00:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:00:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:00.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:00 np0005591762 nova_compute[225313]: 2026-01-22 10:01:00.328 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:00 np0005591762 ovn_controller[133622]: 2026-01-22T10:01:00Z|00097|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 22 05:01:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:01.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:01 np0005591762 nova_compute[225313]: 2026-01-22 10:01:01.654 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:02.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:03.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:04.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:05.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:05 np0005591762 nova_compute[225313]: 2026-01-22 10:01:05.329 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:06.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:06 np0005591762 nova_compute[225313]: 2026-01-22 10:01:06.656 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:07.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:08.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:09.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:10.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:10 np0005591762 nova_compute[225313]: 2026-01-22 10:01:10.330 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:11.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:11 np0005591762 nova_compute[225313]: 2026-01-22 10:01:11.659 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:12.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:13.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:14.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:14 np0005591762 podman[235093]: 2026-01-22 10:01:14.816023293 +0000 UTC m=+0.034487844 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 05:01:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:15.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:15 np0005591762 nova_compute[225313]: 2026-01-22 10:01:15.332 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:15 np0005591762 nova_compute[225313]: 2026-01-22 10:01:15.795 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:16.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.661 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.765 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.765 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:16 np0005591762 nova_compute[225313]: 2026-01-22 10:01:16.766 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:17.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:17 np0005591762 nova_compute[225313]: 2026-01-22 10:01:17.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:17 np0005591762 nova_compute[225313]: 2026-01-22 10:01:17.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:01:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:18.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:18 np0005591762 systemd-logind[744]: New session 54 of user zuul.
Jan 22 05:01:18 np0005591762 systemd[1]: Started Session 54 of User zuul.
Jan 22 05:01:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:18 np0005591762 nova_compute[225313]: 2026-01-22 10:01:18.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:19.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:20 np0005591762 nova_compute[225313]: 2026-01-22 10:01:20.334 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:20.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:20 np0005591762 nova_compute[225313]: 2026-01-22 10:01:20.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 22 05:01:21 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4239099077' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 22 05:01:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:21.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:21 np0005591762 nova_compute[225313]: 2026-01-22 10:01:21.662 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:22.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:22 np0005591762 nova_compute[225313]: 2026-01-22 10:01:22.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:22 np0005591762 podman[235411]: 2026-01-22 10:01:22.840988102 +0000 UTC m=+0.063911057 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 05:01:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:23.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:23 np0005591762 nova_compute[225313]: 2026-01-22 10:01:23.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:01:23 np0005591762 nova_compute[225313]: 2026-01-22 10:01:23.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:01:23 np0005591762 nova_compute[225313]: 2026-01-22 10:01:23.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:01:23 np0005591762 nova_compute[225313]: 2026-01-22 10:01:23.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:01:23 np0005591762 nova_compute[225313]: 2026-01-22 10:01:23.745 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:01:23 np0005591762 nova_compute[225313]: 2026-01-22 10:01:23.745 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:01:24 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:01:24 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2238400565' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.085 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.339s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.293 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.294 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4767MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.295 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.295 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.341 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:01:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:24.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.342 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.358 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:01:24 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:01:24 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2684068865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.698 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:01:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.702 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.718 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.734 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:01:24 np0005591762 nova_compute[225313]: 2026-01-22 10:01:24.735 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:01:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:25.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:25 np0005591762 nova_compute[225313]: 2026-01-22 10:01:25.336 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:26.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:26 np0005591762 nova_compute[225313]: 2026-01-22 10:01:26.665 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:28 np0005591762 ovs-vsctl[235550]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 05:01:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:28.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:28 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 05:01:28 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 05:01:28 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 05:01:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:29 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: cache status {prefix=cache status} (starting...)
Jan 22 05:01:29 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: client ls {prefix=client ls} (starting...)
Jan 22 05:01:29 np0005591762 lvm[235870]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 05:01:29 np0005591762 lvm[235870]: VG ceph_vg0 finished
Jan 22 05:01:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: damage ls {prefix=damage ls} (starting...)
Jan 22 05:01:30 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 22 05:01:30 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3359393665' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump loads {prefix=dump loads} (starting...)
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 22 05:01:30 np0005591762 nova_compute[225313]: 2026-01-22 10:01:30.336 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 22 05:01:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:30.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:30 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 22 05:01:30 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/246583511' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 22 05:01:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 22 05:01:30 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 22 05:01:31 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: ops {prefix=ops} (starting...)
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2010901809' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2523162821' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 22 05:01:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:31.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:31 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: session ls {prefix=session ls} (starting...)
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/60892485' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 22 05:01:31 np0005591762 nova_compute[225313]: 2026-01-22 10:01:31.665 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:31 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: status {prefix=status} (starting...)
Jan 22 05:01:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4183632675' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 22 05:01:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 22 05:01:32 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/77517271' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 22 05:01:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 22 05:01:32 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3006643406' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 22 05:01:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:32.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 22 05:01:32 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1993693931' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 22 05:01:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 22 05:01:33 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1286897775' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 22 05:01:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 22 05:01:33 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3375501635' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 22 05:01:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:33.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 22 05:01:33 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3984812822' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 22 05:01:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:34.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 22 05:01:34 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/842573669' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 22 05:01:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 22 05:01:34 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2495309826' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.997461 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000036 1 0.000051
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.998398 2 0.000094
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.998594 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.998682 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000022 1 0.000034
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.997719 2 0.000089
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.997885 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.997968 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=61) [2] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000021 1 0.000035
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000979 1 0.001018
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.006776 7 0.000146
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.006417 7 0.000055
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 62 handle_osd_map epochs [62,62], i have 62, src has [1,62]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.005825 7 0.000166
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.006478 7 0.000136
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 62 handle_osd_map epochs [61,62], i have 62, src has [1,62]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.128886 2 0.000044
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive 0.128919 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000141 1 0.000039
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.254459 2 0.000059
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive 0.254491 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000117 1 0.000040
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 DELETING pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.139440 2 0.000097
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete 0.139610 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.f( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started 1.275412 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.336624 2 0.000019
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive 0.336654 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000058 1 0.000061
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.403183 2 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive 0.403211 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000072 1 0.000061
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 DELETING pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.154220 2 0.000125
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete 0.154374 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.3( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started 1.415456 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 DELETING pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.094528 2 0.000131
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete 0.094643 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.b( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started 1.437254 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 DELETING pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.035331 2 0.000194
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete 0.035474 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 62 pg[6.7( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=61) [1] r=-1 lpr=61 pi=[57,61)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started 1.445155 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe116000/0x0/0x4ffc00000, data 0x5db71/0xb4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 64487424 unmapped: 483328 heap: 64970752 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 464629 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1a scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1a scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 62 handle_osd_map epochs [62,63], i have 62, src has [1,63]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=8 mbc={}] exit Started/Stray 1.007364 6 0.000119
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=8 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=8 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] exit Started/Stray 1.006677 6 0.000029
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.008179 6 0.000047
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=2 mbc={}] exit Started/Stray 1.008171 6 0.000026
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=2 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=2 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.008214 6 0.000020
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=3 mbc={}] exit Started/Stray 1.009570 6 0.000157
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=3 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=3 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.008782 6 0.000023
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] exit Started/Stray 1.012249 6 0.000026
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 42'838 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.005471 3 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 42'838 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 42'838 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000043 1 0.000053
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 lc 42'838 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.014686 1 0.000048
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 42'806 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.020702 3 0.000147
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 42'806 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 42'806 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000135 1 0.000080
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 lc 42'806 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.038336 1 0.000074
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 42'808 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=3 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.058899 3 0.000074
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 42'808 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=3 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 42'808 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=3 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000109 1 0.000065
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 lc 42'808 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=3 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.024339 1 0.000075
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 42'890 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.083466 3 0.000155
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 42'890 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 42'890 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000127 1 0.000108
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 lc 42'890 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.038539 1 0.000080
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 42'813 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.123702 3 0.000133
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 42'813 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 42'813 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000070 1 0.000112
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 lc 42'813 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.052690 1 0.000024
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 42'851 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.174736 3 0.000167
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 42'851 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 42'851 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000068 1 0.000057
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 lc 42'851 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.038678 1 0.000050
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 42'832 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.215665 3 0.000240
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 42'832 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 42'832 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000078 1 0.000086
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 lc 42'832 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.059903 1 0.000032
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 42'902 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.270755 3 0.000149
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 42'902 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 42'902 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000080 1 0.000079
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 lc 42'902 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.031471 1 0.000052
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 63 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.527156 1 0.000121
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.610813 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.620514 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.435682 1 0.000109
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.612321 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.619033 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000310 1 0.000378
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.591618 1 0.000064
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.611910 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.620103 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000032 1 0.000051
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.336609 1 0.000125
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.613408 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.620871 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.397745 1 0.000095
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.611366 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.620179 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000979 1 0.001199
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000056 1 0.000075
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.305206 1 0.000057
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000271 1 0.001325
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.552332 1 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.490196 1 0.000135
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.612484 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.620747 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000025 1 0.000057
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.608712 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.621551 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000027 1 0.001657
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.613292 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.621547 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[53,62)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000181 1 0.001924
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000076 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002728 2 0.000402
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 64 handle_osd_map epochs [64,64], i have 64, src has [1,64]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002445 2 0.000022
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002628 2 0.000848
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002873 2 0.000023
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002349 2 0.000185
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.004618 2 0.000023
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.007136 2 0.000180
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.008266 2 0.000043
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=33
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=33
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006041 2 0.000045
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000025 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=18
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=18
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005844 2 0.000086
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=33
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=15
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=15
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004968 2 0.000047
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=33
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005947 2 0.000020
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=57
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=57
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002003 2 0.000036
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=45
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=39
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=45
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=39
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005855 2 0.000047
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006146 2 0.000028
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000014 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002047 2 0.000028
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000013 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 64 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 884736 heap: 66019328 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 64 heartbeat osd_stat(store_statfs(0x4fe105000/0x0/0x4ffc00000, data 0x6427e/0xc5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 64 handle_osd_map epochs [65,65], i have 65, src has [1,65]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994951 2 0.000193
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.003842 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994960 2 0.000253
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.003223 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994998 2 0.000553
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.003812 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994951 2 0.000217
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005341 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.995440 2 0.000034
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004015 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.995313 2 0.000084
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004496 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active+clean] exit Started/Primary/Active/Clean 6.254757 19 0.000091
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] exit Started/Primary/Active 6.666334 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] exit Started/Primary 7.667902 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] exit Started 7.667932 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341123581s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 43'42 active pruub 186.711380005s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] exit Reset 0.000049 1 0.000126
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341094971s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711380005s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.995951 2 0.000141
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004885 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active+clean] exit Started/Primary/Active/Clean 5.958989 19 0.000084
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] exit Started/Primary/Active 6.665630 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] exit Started/Primary 7.670863 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] exit Started 7.670882 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 mlcod 43'42 active mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341254234s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 43'42 active pruub 186.711715698s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] exit Reset 0.000085 1 0.000107
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65 pruub=9.341192245s) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY pruub 186.711715698s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.996320 2 0.000045
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005942 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=62/63 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000041 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000034
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000053 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000134 1 0.000148
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000034 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000219 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002233 4 0.000097
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002351 4 0.000225
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002206 4 0.000212
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.17( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002200 4 0.000118
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.3( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002576 4 0.000476
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.001884 4 0.000443
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.13( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=62/53 les/c/f=63/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.001661 4 0.000043
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002777 4 0.000149
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000024 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.7( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/53 les/c/f=65/54/0 sis=64) [2] r=0 lpr=64 pi=[53,64)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000034
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000055 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000147
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000032 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000146 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000034 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000022
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000160 1 0.000115
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000022 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000206 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000018 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=0 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000012
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000095 1 0.000079
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000268 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000388 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 65 handle_osd_map epochs [65,65], i have 65, src has [1,65]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 65 handle_osd_map epochs [64,65], i have 65, src has [1,65]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 753664 heap: 66019328 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 65 handle_osd_map epochs [65,66], i have 65, src has [1,66]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.998162 2 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.998391 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.998413 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000071 1 0.000099
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.997894 2 0.000299
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.998301 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.998315 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000027 1 0.000048
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 66 handle_osd_map epochs [66,66], i have 66, src has [1,66]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.000623 2 0.000092
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.001036 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.001128 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000134 1 0.000338
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.999468 2 0.000096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.999842 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.999934 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=65) [2] r=0 lpr=65 pi=[53,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000149 1 0.000361
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 66 handle_osd_map epochs [66,66], i have 66, src has [1,66]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 66 handle_osd_map epochs [66,66], i have 66, src has [1,66]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.003670 7 0.000048
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.003591 7 0.000045
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 crt=43'42 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 65527808 unmapped: 491520 heap: 66019328 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.070943 2 0.000040
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive 0.070969 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000047 1 0.000063
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.196622 2 0.000045
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ReplicaActive 0.196652 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000045 1 0.000056
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 DELETING pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.135513 2 0.000093
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete 0.135588 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.d( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started 1.210269 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 DELETING pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.038913 2 0.000060
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started/ToDelete 0.038983 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 66 pg[6.5( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=2 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=65) [1] r=-1 lpr=65 pi=[57,65)/1 luod=0'0 crt=43'42 mlcod 0'0 active mbc={}] exit Started 1.239261 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] exit Started/Stray 1.008214 6 0.000936
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=8 mbc={}] exit Started/Stray 1.008303 6 0.000028
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=8 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=8 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 54'1164 lc 0'0 (0'0,54'1164] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=54'1164 mlcod 0'0 remapped NOTIFY m=8 mbc={}] exit Started/Stray 1.007525 6 0.000033
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 54'1164 lc 0'0 (0'0,54'1164] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=54'1164 mlcod 0'0 remapped NOTIFY m=8 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 54'1164 lc 0'0 (0'0,54'1164] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=54'1164 mlcod 0'0 remapped NOTIFY m=8 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.007529 6 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 42'913 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.002535 3 0.000159
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 42'913 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 42'913 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000086 1 0.000031
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 lc 42'913 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 67 handle_osd_map epochs [67,67], i have 67, src has [1,67]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.029225 1 0.000064
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 42'885 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.031691 3 0.000126
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 42'885 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 42'885 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000043 1 0.000026
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 lc 42'885 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=8 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 66936832 unmapped: 131072 heap: 67067904 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.059809 1 0.000016
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 67'1165 lc 0'0 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped m=8 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.091512 3 0.000100
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 67'1165 lc 0'0 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped m=8 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 67'1165 lc 0'0 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped m=8 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000037 1 0.000041
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 67'1165 lc 0'0 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped m=8 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.059864 1 0.000022
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.151384 3 0.000067
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000047 1 0.000045
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.039502 1 0.000027
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 67 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.970618 1 0.000072
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.002613 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.010893 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000150 1 0.000227
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000045 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.911395 1 0.000020
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.002999 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.011336 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000056 1 0.000099
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.851943 1 0.000077
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.003494 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped mbc={}] exit Started 2.011053 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=54'1164 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.812463 1 0.000031
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.003463 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.011012 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[53,66)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000031 1 0.000053
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 luod=0'0 crt=54'1164 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] exit Reset 0.000445 1 0.000518
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] exit Start 0.000100 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.004558 2 0.000166
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.003113 2 0.000246
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.003826 2 0.000023
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.004670 2 0.000302
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 68 handle_osd_map epochs [68,68], i have 68, src has [1,68]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000746 2 0.000050
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=47
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=47
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000703 2 0.000049
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=36
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=36
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001024 2 0.000029
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=51
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=51
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001058 2 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 68 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 32768 heap: 67067904 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 581580 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 68 handle_osd_map epochs [68,69], i have 68, src has [1,69]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 68 handle_osd_map epochs [68,69], i have 69, src has [1,69]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.001008 2 0.000055
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006390 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.000600 2 0.000039
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006456 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.001668 2 0.000042
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005561 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=66/67 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=54'1164 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.001301 2 0.000035
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006187 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=66/67 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004501 3 0.000166
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=66/53 les/c/f=67/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008372 3 0.000137
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008700 3 0.000368
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009009 3 0.000194
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000035 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000031 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 69 pg[9.5( v 67'1165 (0'0,67'1165] local-lis/les=68/69 n=6 ec=53/30 lis/c=68/53 les/c/f=69/54/0 sis=68) [2] r=0 lpr=68 pi=[53,68)/1 crt=67'1165 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67076096 unmapped: 1040384 heap: 68116480 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 69 handle_osd_map epochs [69,69], i have 69, src has [1,69]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fe0f2000/0x0/0x4ffc00000, data 0x6e6cf/0xd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 983040 heap: 68116480 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.971899986s of 10.221476555s, submitted: 281
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 2007040 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 1998848 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fe0f5000/0x0/0x4ffc00000, data 0x6e6cf/0xd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: mgrc handle_mgr_map Got map version 32
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/1082790531,v1:192.168.122.100:6801/1082790531]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 1892352 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 587798 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fe0f5000/0x0/0x4ffc00000, data 0x6e6cf/0xd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 69 handle_osd_map epochs [69,70], i have 69, src has [1,70]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 70 handle_osd_map epochs [70,71], i have 70, src has [1,71]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 1859584 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.e scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 71 handle_osd_map epochs [71,72], i have 71, src has [1,72]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.e scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 72 ms_handle_reset con 0x55bb1581c000 session 0x55bb156f81e0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 1851392 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 72 heartbeat osd_stat(store_statfs(0x4fe0ed000/0x0/0x4ffc00000, data 0x729c8/0xdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 1867776 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.b scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.b scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 1859584 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 73 handle_osd_map epochs [73,74], i have 73, src has [1,74]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 73 handle_osd_map epochs [73,74], i have 74, src has [1,74]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=0 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000280 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=0 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000024 1 0.000048
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.001304 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000195 1 0.001419
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000044 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000341 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=0 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000075 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=0 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000020 1 0.000129
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000095 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000145 1 0.000257
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000030 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000257 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.0 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 3.0 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 1859584 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 612466 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 74 handle_osd_map epochs [74,75], i have 75, src has [1,75]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.005402 2 0.000276
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.005881 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.006038 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000292 1 0.000394
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000047 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.8( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.007025 2 0.000167
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.007428 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.008779 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=74) [2] r=0 lpr=74 pi=[53,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000064 1 0.000122
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 75 pg[9.18( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 75 handle_osd_map epochs [75,75], i have 75, src has [1,75]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 1884160 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 21.184981 54 0.000479
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 21.200292 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 22.201128 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 22.201161 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=57) [2] r=0 lpr=57 crt=43'42 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806510925s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 active pruub 202.711914062s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] exit Reset 0.000082 1 0.000134
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] exit Start 0.000037 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76 pruub=10.806462288s) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 202.711914062s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] exit Started/Stray 1.019456 6 0.000039
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 76 handle_osd_map epochs [76,76], i have 76, src has [1,76]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] exit Started/Stray 1.020173 6 0.000135
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=0 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000108 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=0 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000018
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000121 1 0.000045
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000023 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000166 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=0 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=0 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000016
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000123 1 0.000058
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000034 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000442 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 42'807 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.002818 3 0.000117
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 42'807 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 42'807 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000152 1 0.000080
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 lc 42'807 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.028718 1 0.000027
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 42'825 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.031464 3 0.000097
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 42'825 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 42'825 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000049 1 0.000056
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 lc 42'825 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.052709 1 0.000043
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 76 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 1572864 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 76 handle_osd_map epochs [76,77], i have 76, src has [1,77]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.986656189s of 10.051264763s, submitted: 51
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 76 handle_osd_map epochs [76,77], i have 77, src has [1,77]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.997791 2 0.000414
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.998347 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.998366 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000053 1 0.000087
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.9( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.915650 1 0.000029
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.999936 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.020203 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000031 1 0.000056
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.969626 1 0.000032
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 77 handle_osd_map epochs [77,77], i have 77, src has [1,77]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.002393 2 0.000059
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.002810 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.002842 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=76) [2] r=0 lpr=76 pi=[53,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.001418 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.023419 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=75) [2]/[0] r=-1 lpr=75 pi=[53,75)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000033 1 0.002580
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000454 1 0.000286
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.19( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.012594 7 0.000090
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.004179 2 0.003934
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000767 2 0.000033
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000511 1 0.000506
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=45
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=45
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000881 2 0.000155
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000858 2 0.000028
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 DELETING pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.002009 1 0.000084
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.002551 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 77 pg[6.9( v 43'42 (0'0,43'42] lb MIN local-lis/les=57/58 n=1 ec=49/17 lis/c=57/57 les/c/f=58/58/0 sis=76) [0] r=-1 lpr=76 pi=[57,76)/1 crt=43'42 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.015201 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 67633152 unmapped: 1531904 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 77 handle_osd_map epochs [77,78], i have 77, src has [1,78]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 77 handle_osd_map epochs [78,78], i have 78, src has [1,78]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.011434 2 0.000052
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.016664 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.011720 2 0.000039
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013387 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=75/76 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.001926 3 0.000130
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.8( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 78 handle_osd_map epochs [78,78], i have 78, src has [1,78]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.019859 6 0.000036
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] exit Started/Stray 1.015713 6 0.000466
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=53/53 les/c/f=54/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=7 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=75/53 les/c/f=76/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.003605 4 0.000072
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.18( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2] r=0 lpr=77 pi=[53,77)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 42'845 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.002817 3 0.000179
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 42'845 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 42'845 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000082 1 0.000092
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 lc 42'845 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.091141 1 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 42'830 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.093779 3 0.000250
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 42'830 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 78 heartbeat osd_stat(store_statfs(0x4fe0d8000/0x0/0x4ffc00000, data 0x7f323/0xf1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 42'830 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000109 1 0.000073
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 lc 42'830 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=7 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.052655 1 0.000020
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 78 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.d scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.d scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 1073152 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 78 heartbeat osd_stat(store_statfs(0x4fe0d4000/0x0/0x4ffc00000, data 0x81736/0xf7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 78 handle_osd_map epochs [78,79], i have 78, src has [1,79]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.908767 1 0.000036
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.002893 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.022776 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000058 1 0.000096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.856721 1 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.003347 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.019206 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=77) [2]/[0] r=-1 lpr=77 pi=[53,77)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000032 1 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 79 handle_osd_map epochs [79,79], i have 79, src has [1,79]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002278 2 0.000039
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.001746 2 0.000028
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=36
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=36
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000988 2 0.000360
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0olog.dups.size()=45
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=45
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001108 2 0.000044
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 79 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 999424 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 672395 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 79 heartbeat osd_stat(store_statfs(0x4fe0d0000/0x0/0x4ffc00000, data 0x8376e/0xfa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.000273 2 0.000078
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.003606 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 80 handle_osd_map epochs [80,80], i have 80, src has [1,80]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 80 handle_osd_map epochs [79,80], i have 80, src has [1,80]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.000985 2 0.000068
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.003889 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=77/78 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.001946 4 0.000115
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.9( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=6 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=77/53 les/c/f=78/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.001451 4 0.000062
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000082 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 80 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/53 les/c/f=80/54/0 sis=79) [2] r=0 lpr=79 pi=[53,79)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fe0d0000/0x0/0x4ffc00000, data 0x8376e/0xfa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 80 handle_osd_map epochs [80,81], i have 80, src has [1,81]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 942080 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 81 handle_osd_map epochs [81,82], i have 81, src has [1,82]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 901120 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 876544 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 868352 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 82 handle_osd_map epochs [82,83], i have 82, src has [1,83]
Jan 22 05:01:34 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.17 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.17 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 868352 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688698 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 83 handle_osd_map epochs [83,84], i have 83, src has [1,84]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 720896 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 85 heartbeat osd_stat(store_statfs(0x4fcf21000/0x0/0x4ffc00000, data 0x8d8e1/0x109000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.b scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.b scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 704512 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.3 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.255112648s of 10.343056679s, submitted: 75
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.3 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68493312 unmapped: 671744 heap: 69165056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 85 handle_osd_map epochs [85,86], i have 85, src has [1,86]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 22.624749 52 0.000148
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 22.633868 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 23.640359 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 23.640414 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374924660s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 active pruub 213.398483276s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 22.625303 52 0.000159
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 22.633719 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 23.639916 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 23.639933 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374703407s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 active pruub 213.398498535s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] exit Reset 0.000046 1 0.000063
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374675751s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398498535s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] exit Reset 0.000496 1 0.000602
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] exit Start 0.000086 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 86 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86 pruub=9.374480247s) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 213.398483276s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 86 handle_osd_map epochs [86,86], i have 86, src has [1,86]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 1646592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.011610 3 0.000255
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.011770 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000113 1 0.000165
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000071 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.012558 3 0.000030
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.012647 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=86) [1] r=-1 lpr=86 pi=[68,86)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000228 1 0.000202
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002471 2 0.000198
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 87 handle_osd_map epochs [87,87], i have 87, src has [1,87]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000024 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002047 2 0.000043
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000013 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 87 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68583424 unmapped: 1630208 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 706268 data_alloc: 218103808 data_used: 0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 87 handle_osd_map epochs [87,88], i have 88, src has [1,88]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.006068 3 0.000081
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.008647 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 88 handle_osd_map epochs [88,88], i have 88, src has [1,88]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=8}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.006714 3 0.000043
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.008810 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] exit Started/Primary/Active/Activating 0.007618 5 0.000710
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=8}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000073 1 0.000067
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=8}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=8}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000345 1 0.000033
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=8}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/Activating 0.010915 5 0.000175
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.061616 2 0.000115
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.058202 1 0.000034
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000464 1 0.000111
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.038163 2 0.000059
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 88 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.265441 1 0.000094
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.226675 1 0.000058
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 0.334650 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 1.343474 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 0.335438 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 1.344150 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 1.343498 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 1.344258 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[68,87)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676091194s) [1] async=[1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 43'1161 active pruub 222.056365967s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] exit Reset 0.000073 1 0.000138
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.676045418s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.056365967s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671806335s) [1] async=[1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 43'1161 active pruub 222.052169800s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] exit Reset 0.000258 1 0.000356
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] exit Start 0.000052 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 89 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89 pruub=15.671649933s) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 222.052169800s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 89 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 89 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68616192 unmapped: 1597440 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 89 heartbeat osd_stat(store_statfs(0x4fcf12000/0x0/0x4ffc00000, data 0x98082/0x118000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 30.011385 76 0.000173
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 30.013666 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 31.019022 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 31.019057 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988652229s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 active pruub 217.372222900s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] exit Reset 0.000148 1 0.000199
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988621712s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372222900s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 30.011784 76 0.000274
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 30.013710 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 31.018608 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 31.018800 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988155365s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 active pruub 217.372268677s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] exit Reset 0.000057 1 0.000112
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90 pruub=9.988128662s) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 217.372268677s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 90 handle_osd_map epochs [90,90], i have 90, src has [1,90]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.012295 7 0.000295
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000066 1 0.000072
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.013504 7 0.000194
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000057 1 0.000065
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 DELETING pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.070549 2 0.000158
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.070687 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.d( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=87/88 n=8 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.083132 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 DELETING pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.106699 2 0.000125
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.106812 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 90 pg[9.1d( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=87/88 n=5 ec=53/30 lis/c=87/68 les/c/f=88/69/0 sis=89) [1] r=-1 lpr=89 pi=[68,89)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.120368 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 1540096 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.004220 3 0.000027
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.004261 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000165 1 0.000209
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000048 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.004189 3 0.000032
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.004228 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=90) [1] r=-1 lpr=90 pi=[64,90)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000170 1 0.000212
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000131 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002452 2 0.000299
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 91 handle_osd_map epochs [91,91], i have 91, src has [1,91]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000069 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000015 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002429 2 0.000313
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000045 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000014 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 91 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 1540096 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 91 heartbeat osd_stat(store_statfs(0x4fcf11000/0x0/0x4ffc00000, data 0x9a1c7/0x118000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.008005 3 0.000169
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.010655 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.008468 3 0.000139
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.011151 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 92 handle_osd_map epochs [91,92], i have 92, src has [1,92]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 1499136 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 92 heartbeat osd_stat(store_statfs(0x4fcf0d000/0x0/0x4ffc00000, data 0x9c1d5/0x11b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/Activating 0.138470 5 0.000612
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000084 1 0.000084
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000339 1 0.000023
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] exit Started/Primary/Active/Activating 0.140722 5 0.000394
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.034762 1 0.000039
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.035518 2 0.000072
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000334 1 0.000038
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.052450 2 0.000086
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 92 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.774762 1 0.000112
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 1.003305 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 2.013996 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 2.014116 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137292862s) [1] async=[1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 43'1161 active pruub 225.539581299s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] exit Reset 0.000109 1 0.000157
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] exit Start 0.000007 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.137216568s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.539581299s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.828157 1 0.000157
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 1.002919 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 2.014120 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 2.014305 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[64,91)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135436058s) [1] async=[1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 43'1161 active pruub 225.538360596s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] exit Reset 0.000223 1 0.000311
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] exit Start 0.000101 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 93 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93 pruub=15.135251045s) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 225.538360596s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 1499136 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 705288 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.015129 7 0.000211
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.016006 7 0.000061
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000101 1 0.000085
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000168 1 0.000117
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 DELETING pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.060586 2 0.000132
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.060721 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.f( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=91/92 n=6 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.076779 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 DELETING pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.097491 2 0.000115
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.097699 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 94 pg[9.1f( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=91/92 n=5 ec=53/30 lis/c=91/64 les/c/f=92/65/0 sis=93) [1] r=-1 lpr=93 pi=[64,93)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.113006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68730880 unmapped: 1482752 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68730880 unmapped: 1482752 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.936411858s of 10.048495293s, submitted: 79
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68780032 unmapped: 1433600 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 94 heartbeat osd_stat(store_statfs(0x4fcf0a000/0x0/0x4ffc00000, data 0xa203b/0x122000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.d scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.d scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 1425408 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 1425408 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 694535 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 1409024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 1409024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 96 heartbeat osd_stat(store_statfs(0x4fcf06000/0x0/0x4ffc00000, data 0xa4127/0x125000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.3 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.3 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68780032 unmapped: 1433600 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 1425408 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.c scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.c scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 1335296 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 713496 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 99 heartbeat osd_stat(store_statfs(0x4fcaec000/0x0/0x4ffc00000, data 0xaa212/0x12e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.4 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.4 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68886528 unmapped: 1327104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fcae8000/0x0/0x4ffc00000, data 0xac1d1/0x131000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fcae7000/0x0/0x4ffc00000, data 0xae173/0x134000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.a deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.066858292s of 10.112045288s, submitted: 76
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.a deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 1269760 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 1245184 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 1236992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 723709 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.d deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.d deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 69042176 unmapped: 1171456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 103 heartbeat osd_stat(store_statfs(0x4fcadc000/0x0/0x4ffc00000, data 0xb4340/0x13d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70115328 unmapped: 1146880 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 104 handle_osd_map epochs [104,105], i have 104, src has [1,105]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 1138688 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 105 heartbeat osd_stat(store_statfs(0x4fcada000/0x0/0x4ffc00000, data 0xb62f7/0x140000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 1277952 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 740313 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 4.14 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 105 heartbeat osd_stat(store_statfs(0x4fcad9000/0x0/0x4ffc00000, data 0xb83eb/0x143000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 50.173248 116 0.001729
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 50.177824 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 51.184268 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 51.184582 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=68) [2] r=0 lpr=68 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827784538s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 active pruub 245.395080566s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] exit Reset 0.000357 1 0.000677
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] exit Start 0.000162 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 106 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106 pruub=13.827475548s) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 245.395080566s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 1245184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 1228800 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 106 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.007997 3 0.000280
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.008244 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=106) [1] r=-1 lpr=106 pi=[68,106)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000057 1 0.000086
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000031 1 0.000036
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000021 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 107 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 107 heartbeat osd_stat(store_statfs(0x4fcad5000/0x0/0x4ffc00000, data 0xba4d7/0x146000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.14 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.977118492s of 10.019715309s, submitted: 43
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.14 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 1212416 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.024153 4 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.024313 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=68/69 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=4}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=0 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=0 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000133 1 0.000044
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000032 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000186 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 108 handle_osd_map epochs [107,108], i have 108, src has [1,108]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=68/68 les/c/f=69/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] exit Started/Primary/Active/Activating 0.002646 5 0.000907
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=4}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000071 1 0.000066
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=4}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=4}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000409 1 0.000047
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=4}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.028339 2 0.000050
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 108 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 1212416 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.978097 1 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 1.009930 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 2.034622 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 2.034647 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=107) [1]/[2] async=[1] r=0 lpr=107 pi=[68,107)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992759705s) [1] async=[1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 43'1161 active pruub 249.603393555s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] exit Reset 0.000091 1 0.000157
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] exit Start 0.000006 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109 pruub=14.992709160s) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 249.603393555s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.009256 2 0.000063
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.009461 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.009484 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=108) [2] r=0 lpr=108 pi=[73,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000050 1 0.000061
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000003 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 109 pg[9.16( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1f deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1f deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 69943296 unmapped: 1318912 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 761376 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] exit Started/Stray 1.017298 6 0.000041
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=73/73 les/c/f=74/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.020611 7 0.000070
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000041 1 0.000031
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 42'842 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.003648 3 0.000101
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 42'842 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 42'842 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000043 1 0.000028
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 lc 42'842 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 DELETING pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.034460 2 0.000144
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.034545 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.15( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=107/108 n=4 ec=53/30 lis/c=107/68 les/c/f=108/69/0 sis=109) [1] r=-1 lpr=109 pi=[68,109)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.055190 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.056272 1 0.000030
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 110 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.710443 1 0.000031
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.770551 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 1.787950 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[73,109)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000276 1 0.000463
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000210 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.003951 2 0.000557
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=27
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000470 2 0.000076
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000015 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 111 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 1236992 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 111 heartbeat osd_stat(store_statfs(0x4fcac5000/0x0/0x4ffc00000, data 0xc4513/0x155000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.002442 2 0.000106
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006964 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=109/110 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=109/73 les/c/f=110/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=111/73 les/c/f=112/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.000913 3 0.000102
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=111/73 les/c/f=112/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=111/73 les/c/f=112/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 112 pg[9.16( v 43'1161 (0'0,43'1161] local-lis/les=111/112 n=5 ec=53/30 lis/c=111/73 les/c/f=112/74/0 sis=111) [2] r=0 lpr=111 pi=[73,111)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 112 handle_osd_map epochs [111,112], i have 112, src has [1,112]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 1212416 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 1179648 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 1163264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 112 heartbeat osd_stat(store_statfs(0x4fcac1000/0x0/0x4ffc00000, data 0xc64b5/0x158000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1c deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.1c deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 1155072 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 774902 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 1155072 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 1138688 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.11 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.945323944s of 10.010906219s, submitted: 53
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.11 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70139904 unmapped: 1122304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=79) [2] r=0 lpr=79 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 47.743464 108 0.000319
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=79) [2] r=0 lpr=79 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 47.745081 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=79) [2] r=0 lpr=79 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 48.748995 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=79) [2] r=0 lpr=79 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 48.749031 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=79) [2] r=0 lpr=79 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256608009s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 active pruub 251.942764282s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] exit Reset 0.000400 1 0.000505
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] exit Start 0.000116 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 115 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=8.256252289s) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 251.942764282s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 115 heartbeat osd_stat(store_statfs(0x4fcabd000/0x0/0x4ffc00000, data 0xca68d/0x15e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70139904 unmapped: 1122304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.002803 3 0.000204
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.002968 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=115) [0] r=-1 lpr=115 pi=[79,115)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000044 1 0.000064
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.002057 2 0.000038
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000035 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000037 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 116 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 116 heartbeat osd_stat(store_statfs(0x4fcab9000/0x0/0x4ffc00000, data 0xcc779/0x161000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.12 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.12 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70156288 unmapped: 1105920 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 789422 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.001990 3 0.000178
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.004200 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=79/80 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=79/79 les/c/f=80/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] exit Started/Primary/Active/Activating 0.002129 5 0.000592
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000054 1 0.000073
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000444 1 0.000083
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=7}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 117 handle_osd_map epochs [116,117], i have 117, src has [1,117]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.049489 2 0.000057
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 117 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 117 handle_osd_map epochs [117,118], i have 117, src has [1,118]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.655948 1 0.000042
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 0.708421 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 1.712636 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 1.712658 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=116) [0]/[2] async=[0] r=0 lpr=116 pi=[79,116)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293571472s) [0] async=[0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 43'1161 active pruub 261.695831299s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] exit Reset 0.000083 1 0.000132
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 118 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.293519020s) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 261.695831299s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 1056768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 118 heartbeat osd_stat(store_statfs(0x4fcaaf000/0x0/0x4ffc00000, data 0xd2811/0x16a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 118 handle_osd_map epochs [118,119], i have 118, src has [1,119]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 70.033271 168 0.000430
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active 70.034993 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary 71.040957 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] exit Started 71.040994 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=64) [2] r=0 lpr=64 crt=43'1161 mlcod 0'0 active mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967482567s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 active pruub 257.373352051s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] exit Reset 0.000170 1 0.000257
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] exit Start 0.000042 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119 pruub=9.967348099s) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 257.373352051s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.008843 7 0.000069
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000047 1 0.000070
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 DELETING pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.064074 2 0.000131
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.064172 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 119 pg[9.19( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=116/117 n=5 ec=53/30 lis/c=116/79 les/c/f=117/80/0 sis=118) [0] r=-1 lpr=118 pi=[79,118)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.073055 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70213632 unmapped: 1048576 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 119 handle_osd_map epochs [119,120], i have 119, src has [1,120]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.004251 3 0.000114
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.004512 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=119) [0] r=-1 lpr=119 pi=[64,119)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Reset 0.000402 1 0.000778
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] exit Start 0.000109 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.004094 2 0.000251
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000048 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000014 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 120 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.999230 3 0.000116
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 1.003471 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=64/65 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 activating+remapped mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 121 handle_osd_map epochs [120,121], i have 121, src has [1,121]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=64/64 les/c/f=65/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.005485 5 0.000176
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000071 1 0.000040
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000349 1 0.000025
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 0.015332 2 0.000036
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 121 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 121 handle_osd_map epochs [121,122], i have 121, src has [1,122]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 0.982892 1 0.000144
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary/Active 1.004297 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started/Primary 2.007793 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] exit Started 2.007982 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[64,120)/1 crt=43'1161 mlcod 43'1161 active+remapped mbc={255={}}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001114845s) [0] async=[0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 43'1161 active pruub 265.420257568s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] exit Reset 0.000068 1 0.000114
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] exit Start 0.000005 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 122 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122 pruub=15.001076698s) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY pruub 265.420257568s@ mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 802314 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.017197 7 0.000109
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000096 1 0.000114
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 DELETING pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.016134 2 0.000157
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.016305 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 123 pg[9.1b( v 43'1161 (0'0,43'1161] lb MIN local-lis/les=120/121 n=5 ec=53/30 lis/c=120/64 les/c/f=121/65/0 sis=122) [0] r=-1 lpr=122 pi=[64,122)/1 crt=43'1161 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.033558 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 1892352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 123 heartbeat osd_stat(store_statfs(0x4fcaa4000/0x0/0x4ffc00000, data 0xdc789/0x177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 1884160 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 1867776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.885115623s of 10.964232445s, submitted: 66
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 1859584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 1851392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 803964 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 124 heartbeat osd_stat(store_statfs(0x4fcaa1000/0x0/0x4ffc00000, data 0xde875/0x17a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70483968 unmapped: 1826816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 124 heartbeat osd_stat(store_statfs(0x4fcaa1000/0x0/0x4ffc00000, data 0xde875/0x17a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70492160 unmapped: 1818624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d(unlocked)] enter Initial
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=0 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000060 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=0 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000033
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000010 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000225 1 0.000042
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000032 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000299 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 125 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 1777664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 125 heartbeat osd_stat(store_statfs(0x4fca9e000/0x0/0x4ffc00000, data 0xe0961/0x17d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.010701 2 0.000107
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.011055 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.011094 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=125) [2] r=0 lpr=125 pi=[89,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000179 1 0.000252
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000270 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 126 pg[9.1d( empty local-lis/les=0/0 n=0 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 126 handle_osd_map epochs [125,126], i have 126, src has [1,126]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 1761280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.023101 6 0.000627
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 0'0 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=89/89 les/c/f=90/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 crt=43'1161 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.002072 3 0.000183
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000086 1 0.000183
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 lc 42'863 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.035786 1 0.000035
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 127 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 1761280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831740 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.963495 1 0.000073
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.001701 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] exit Started 2.025326 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=126) [2]/[1] r=-1 lpr=126 pi=[89,126)/1 luod=0'0 crt=43'1161 mlcod 0'0 active+remapped mbc={}] enter Reset
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 luod=0'0 crt=43'1161 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Reset 0.000305 1 0.000390
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Start
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] exit Start 0.000090 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.006279 2 0.000191
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=0/0 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: merge_log_dups log.dups.size()=0 olog.dups.size()=36
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=36
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001023 2 0.000054
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 128 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 1761280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 129 handle_osd_map epochs [128,129], i have 129, src has [1,129]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.001498 2 0.000051
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.008860 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=126/127 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=126/89 les/c/f=127/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=128/89 les/c/f=129/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.001625 4 0.000087
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=128/89 les/c/f=129/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=128/89 les/c/f=129/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000025 0 0.000000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 pg_epoch: 129 pg[9.1d( v 43'1161 (0'0,43'1161] local-lis/les=128/129 n=5 ec=53/30 lis/c=128/89 les/c/f=129/90/0 sis=128) [2] r=0 lpr=128 pi=[89,128)/1 crt=43'1161 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 1761280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 1761280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.975269318s of 10.034676552s, submitted: 47
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 129 heartbeat osd_stat(store_statfs(0x4fca91000/0x0/0x4ffc00000, data 0xe8925/0x18b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 1753088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 1761280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842153 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.b scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.b scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 1736704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 1728512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 1712128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.13 deep-scrub starts
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: log_channel(cluster) log [DBG] : 9.13 deep-scrub ok
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 1662976 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 135 heartbeat osd_stat(store_statfs(0x4fca81000/0x0/0x4ffc00000, data 0xf2aa6/0x19a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 1654784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 865903 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 1638400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7d000/0x0/0x4ffc00000, data 0xf4a5d/0x19d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 1630208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 1622016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 1622016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 1613824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 1613824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 1613824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 1589248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 1589248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 1581056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 1581056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 1581056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 1572864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 1572864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 1564672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 1564672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 1556480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 1548288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 1548288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 1540096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 1540096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 1531904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 1531904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 1523712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 1523712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 1523712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 1507328 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 1490944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1482752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1482752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1482752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1482752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1482752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 1474560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 1474560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 1466368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 1466368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 1466368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70852608 unmapped: 1458176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70852608 unmapped: 1458176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 1441792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1417216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1417216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 1409024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 1409024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 1400832 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 1400832 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 1384448 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 1376256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 1376256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 1368064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 1359872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 1335296 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 1335296 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 1335296 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 1327104 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 1327104 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 1318912 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 1318912 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb166d7400 session 0x55bb1691bc20
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 1318912 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 1310720 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 1302528 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 1302528 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 1294336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 1294336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869593 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 1294336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb14a99400 session 0x55bb166e8000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 1261568 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1253376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 75.985519409s of 76.027488708s, submitted: 33
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca79000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 1236992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 867357 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 1236992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 1228800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 139264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 131072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 867373 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 114688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 114688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 65536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 57344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 57344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869017 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.510575294s of 10.519902229s, submitted: 9
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 8192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72310784 unmapped: 0 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 1040384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72368128 unmapped: 991232 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72368128 unmapped: 991232 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869017 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 983040 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 917504 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 917504 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 909312 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 901120 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868278 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb15839c00 session 0x55bb16862f00
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 901120 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72474624 unmapped: 884736 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.742706299s of 12.752790451s, submitted: 11
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 860160 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 860160 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 851968 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868146 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 835584 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72531968 unmapped: 827392 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72548352 unmapped: 811008 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72548352 unmapped: 811008 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 802816 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868146 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 802816 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72572928 unmapped: 786432 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72572928 unmapped: 786432 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72572928 unmapped: 786432 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.296337128s of 11.299195290s, submitted: 3
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 871318 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169bac00 session 0x55bb15561c20
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 679936 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72687616 unmapped: 671744 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 663552 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 871318 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72687616 unmapped: 671744 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 663552 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 655360 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 647168 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72720384 unmapped: 638976 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 871318 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72720384 unmapped: 638976 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb166d9800 session 0x55bb16398000
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 630784 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.930956841s of 12.938614845s, submitted: 8
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 871302 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 557056 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 557056 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 871318 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 532480 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 499712 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72867840 unmapped: 491520 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72867840 unmapped: 491520 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.904683113s of 12.913397789s, submitted: 9
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 870859 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72916992 unmapped: 442368 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 434176 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873160 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 74014720 unmapped: 393216 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 385024 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 385024 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72990720 unmapped: 1417216 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169bc400 session 0x55bb16399a40
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 72998912 unmapped: 1409024 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873144 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73007104 unmapped: 1400832 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.120276451s of 11.132685661s, submitted: 14
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73023488 unmapped: 1384448 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73031680 unmapped: 1376256 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73031680 unmapped: 1376256 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73039872 unmapped: 1368064 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873012 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73048064 unmapped: 1359872 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73056256 unmapped: 1351680 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73056256 unmapped: 1351680 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73064448 unmapped: 1343488 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 1318912 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873144 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 1318912 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 1310720 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.849093437s of 11.852244377s, submitted: 3
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 1294336 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 1294336 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 1286144 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874672 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 1277952 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 1269760 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 1269760 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 1253376 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73162752 unmapped: 1245184 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875425 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 1236992 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 1236992 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 1228800 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874986 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 1212416 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.579069138s of 14.590522766s, submitted: 11
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 1196032 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874854 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c4c00 session 0x55bb166e8960
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 1187840 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 1179648 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 1179648 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 1179648 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 1171456 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874854 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 1171456 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 1163264 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 1163264 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874854 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 1146880 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 1146880 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.621748924s of 14.622872353s, submitted: 1
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73269248 unmapped: 1138688 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874986 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 1114112 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73302016 unmapped: 1105920 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 1089536 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876514 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73334784 unmapped: 1073152 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.005198479s of 12.014612198s, submitted: 11
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875316 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c0c00 session 0x55bb15561a40
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875184 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875184 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.628890991s of 13.631286621s, submitted: 2
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875316 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876844 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876237 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.120885849s of 17.131437302s, submitted: 12
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73547776 unmapped: 860160 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73547776 unmapped: 860160 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73547776 unmapped: 860160 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73572352 unmapped: 835584 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 819200 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 819200 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 811008 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 811008 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 802816 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 802816 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 802816 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 770048 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 761856 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 761856 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 696320 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 671744 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 663552 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c3c00 session 0x55bb146aaf00
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 647168 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 647168 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 647168 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 614400 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 606208 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 606208 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.835266113s of 55.836685181s, submitted: 1
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 606208 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 598016 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73818112 unmapped: 589824 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 877765 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 507904 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878518 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 507904 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 507904 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 491520 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 491520 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 483328 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878670 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 483328 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.329652786s of 16.341135025s, submitted: 13
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 74989568 unmapped: 466944 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75014144 unmapped: 442368 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75014144 unmapped: 442368 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75022336 unmapped: 434176 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75022336 unmapped: 434176 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75030528 unmapped: 425984 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75055104 unmapped: 401408 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75055104 unmapped: 401408 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75055104 unmapped: 401408 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75063296 unmapped: 393216 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75063296 unmapped: 393216 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75063296 unmapped: 393216 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75071488 unmapped: 385024 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75087872 unmapped: 368640 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75096064 unmapped: 360448 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75096064 unmapped: 360448 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e30400 session 0x55bb1691bc20
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75096064 unmapped: 360448 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75104256 unmapped: 352256 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75104256 unmapped: 352256 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 5918 writes, 25K keys, 5918 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5918 writes, 1040 syncs, 5.69 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5918 writes, 25K keys, 5918 commit groups, 1.0 writes per commit group, ingest: 19.22 MB, 0.03 MB/s
Interval WAL: 5918 writes, 1040 syncs, 5.69 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75210752 unmapped: 245760 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.951286316s of 33.952785492s, submitted: 1
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75227136 unmapped: 229376 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75235328 unmapped: 221184 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75251712 unmapped: 204800 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75284480 unmapped: 172032 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75292672 unmapped: 163840 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880198 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75292672 unmapped: 163840 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75317248 unmapped: 139264 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75325440 unmapped: 131072 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75325440 unmapped: 131072 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75350016 unmapped: 106496 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879439 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75358208 unmapped: 98304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75358208 unmapped: 98304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.006448746s of 12.019389153s, submitted: 12
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75382784 unmapped: 73728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75382784 unmapped: 73728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75382784 unmapped: 73728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879000 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75390976 unmapped: 65536 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75407360 unmapped: 49152 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75415552 unmapped: 40960 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75415552 unmapped: 40960 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75423744 unmapped: 32768 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75431936 unmapped: 24576 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1015808 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75825152 unmapped: 679936 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 671744 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 663552 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 663552 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 663552 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 655360 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 655360 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 647168 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 647168 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 638976 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 638976 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16ad9400 session 0x55bb155612c0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 638976 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 630784 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.670803070s of 32.730369568s, submitted: 84
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 77168640 unmapped: 1433600 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879000 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.134506226s of 12.238805771s, submitted: 166
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78454784 unmapped: 147456 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78454784 unmapped: 147456 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78454784 unmapped: 147456 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880528 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 131072 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 131072 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 122880 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 122880 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78487552 unmapped: 114688 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879769 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 106496 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 106496 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78503936 unmapped: 98304 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78503936 unmapped: 98304 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 90112 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879921 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.997126579s of 13.009223938s, submitted: 9
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 90112 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb14672c00 session 0x55bb172f65a0
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 90112 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78520320 unmapped: 81920 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 65536 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78544896 unmapped: 57344 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879789 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78544896 unmapped: 57344 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78544896 unmapped: 57344 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78553088 unmapped: 49152 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78553088 unmapped: 49152 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 40960 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879789 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 40960 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 32768 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.138556480s of 12.139929771s, submitted: 1
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78577664 unmapped: 24576 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78577664 unmapped: 24576 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78585856 unmapped: 16384 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879921 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78610432 unmapped: 1040384 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78610432 unmapped: 1040384 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 1032192 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78635008 unmapped: 1015808 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78643200 unmapped: 1007616 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882961 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78651392 unmapped: 999424 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78659584 unmapped: 991232 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78667776 unmapped: 983040 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78667776 unmapped: 983040 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78675968 unmapped: 974848 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882354 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78684160 unmapped: 966656 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78684160 unmapped: 966656 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb1581c400 session 0x55bb147a3680
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78692352 unmapped: 958464 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78692352 unmapped: 958464 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.831901550s of 16.840600967s, submitted: 12
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78700544 unmapped: 950272 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882222 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78700544 unmapped: 950272 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 942080 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 942080 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 942080 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78725120 unmapped: 925696 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882222 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78725120 unmapped: 925696 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78733312 unmapped: 917504 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78733312 unmapped: 917504 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:34 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78749696 unmapped: 901120 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882354 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.236344337s of 12.239408493s, submitted: 3
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 860160 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb1466e400 session 0x55bb147a30e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 827392 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885394 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885394 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78848000 unmapped: 802816 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78848000 unmapped: 802816 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.904095650s of 10.914563179s, submitted: 9
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884787 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [1])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884803 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.533043861s of 11.542860031s, submitted: 10
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884044 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883605 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78905344 unmapped: 745472 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883473 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c7800 session 0x55bb168632c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c0400 session 0x55bb1641c000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883473 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883473 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.116767883s of 25.120988846s, submitted: 4
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883737 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 712704 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885265 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 663552 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885097 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.843386650s of 15.856459618s, submitted: 13
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884985 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e31000 session 0x55bb166e8000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884985 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884985 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.390679359s of 11.392770767s, submitted: 2
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 1687552 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 1687552 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 1687552 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 1679360 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886645 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e32c00 session 0x55bb146ab4a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 1679360 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 1679360 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886645 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.008318901s of 12.015848160s, submitted: 9
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886038 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 1638400 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 1638400 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887566 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 1638400 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 1613824 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 1613824 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 1613824 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.801233292s of 11.814348221s, submitted: 12
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889078 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888471 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169bd800 session 0x55bb1796e5a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 84.316741943s of 84.330902100s, submitted: 3
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888471 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889999 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79183872 unmapped: 1515520 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79200256 unmapped: 1499136 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79200256 unmapped: 1499136 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890752 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.004010201s of 12.014162064s, submitted: 12
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79224832 unmapped: 1474560 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890313 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 1441792 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 1441792 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e2f800 session 0x55bb146aba40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e30c00 session 0x55bb146aa780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 110.737770081s of 110.740150452s, submitted: 2
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79405056 unmapped: 1294336 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890461 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79437824 unmapped: 1261568 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79437824 unmapped: 1261568 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79437824 unmapped: 1261568 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79454208 unmapped: 1245184 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79470592 unmapped: 1228800 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890461 data_alloc: 218103808 data_used: 4096
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.267648697s of 10.278943062s, submitted: 11
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79486976 unmapped: 1212416 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889263 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888999 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf8af3/0x1a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fbe03000/0x0/0x4ffc00000, data 0xd6ac61/0xe18000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 17768448 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.349076271s of 10.382431030s, submitted: 31
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 139 ms_handle_reset con 0x55bb166d8000 session 0x55bb179d8780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 17768448 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79839232 unmapped: 17645568 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 140 ms_handle_reset con 0x55bb166d9c00 session 0x55bb179952c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79847424 unmapped: 17637376 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1071604 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79855616 unmapped: 17629184 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79855616 unmapped: 17629184 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fb18c000/0x0/0x4ffc00000, data 0x19dee94/0x1a8f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79872000 unmapped: 17612800 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fb18c000/0x0/0x4ffc00000, data 0x19dee94/0x1a8f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79872000 unmapped: 17612800 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074530 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074530 data_alloc: 218103808 data_used: 8192
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 ms_handle_reset con 0x55bb166d6000 session 0x55bb152c1e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 ms_handle_reset con 0x55bb169c5c00 session 0x55bb155614a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 91545600 unmapped: 5939200 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.163917542s of 39.186256409s, submitted: 33
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 91545600 unmapped: 5939200 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb14673c00 session 0x55bb172f7680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d6000 session 0x55bb169230e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d8000 session 0x55bb16862000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d9c00 session 0x55bb179243c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb169c5c00 session 0x55bb152c01e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fb186000/0x0/0x4ffc00000, data 0x19e2f52/0x1a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb16c50c00 session 0x55bb163f3e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167932 data_alloc: 234881024 data_used: 11485184
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fac61000/0x0/0x4ffc00000, data 0x1f050f4/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d6000 session 0x55bb16526000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb13e61400 session 0x55bb156f9c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb1466fc00 session 0x55bb176cd4a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93052928 unmapped: 8765440 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93069312 unmapped: 8749056 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201211 data_alloc: 234881024 data_used: 15548416
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac3a000/0x0/0x4ffc00000, data 0x1f2b0e9/0x1fe1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac3a000/0x0/0x4ffc00000, data 0x1f2b0e9/0x1fe1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201211 data_alloc: 234881024 data_used: 15548416
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.756557465s of 16.806152344s, submitted: 91
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 99557376 unmapped: 2260992 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103743488 unmapped: 4366336 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f91a3000/0x0/0x4ffc00000, data 0x28140e9/0x28ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f91a3000/0x0/0x4ffc00000, data 0x28140e9/0x28ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279961 data_alloc: 234881024 data_used: 15925248
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb14a99400 session 0x55bb17938f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9175000/0x0/0x4ffc00000, data 0x28510e9/0x2907000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9175000/0x0/0x4ffc00000, data 0x28510e9/0x2907000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31c00 session 0x55bb17939860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279145 data_alloc: 234881024 data_used: 15925248
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9173000/0x0/0x4ffc00000, data 0x28530e9/0x2909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9173000/0x0/0x4ffc00000, data 0x28530e9/0x2909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279145 data_alloc: 234881024 data_used: 15925248
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.602156639s of 12.680288315s, submitted: 136
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9171000/0x0/0x4ffc00000, data 0x28550e9/0x290b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9171000/0x0/0x4ffc00000, data 0x28550e9/0x290b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278989 data_alloc: 234881024 data_used: 15929344
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad8c00 session 0x55bb156f8960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6000 session 0x55bb156f90e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c1800 session 0x55bb1691a1e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb1691bc20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105324544 unmapped: 2785280 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be000 session 0x55bb1691a780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581cc00 session 0x55bb16526780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb16526f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be000 session 0x55bb163985a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c1800 session 0x55bb16399c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6000 session 0x55bb16398b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb16399a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3f000/0x0/0x4ffc00000, data 0x2e860f9/0x2f3d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1332237 data_alloc: 234881024 data_used: 16453632
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33400 session 0x55bb16394b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.629212379s of 10.656615257s, submitted: 21
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4c00 session 0x55bb16395e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb1647cf00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c3800 session 0x55bb172f70e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3e000/0x0/0x4ffc00000, data 0x2e86109/0x2f3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105488384 unmapped: 6955008 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105504768 unmapped: 6938624 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336136 data_alloc: 234881024 data_used: 16490496
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3e000/0x0/0x4ffc00000, data 0x2e86109/0x2f3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 1572864 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1379372 data_alloc: 234881024 data_used: 22847488
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 1572864 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.993770599s of 10.007340431s, submitted: 18
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3c000/0x0/0x4ffc00000, data 0x2e87109/0x2f3f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110903296 unmapped: 1540096 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110903296 unmapped: 1540096 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4292608 heap: 117686272 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118677504 unmapped: 3211264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1520512 data_alloc: 234881024 data_used: 23384064
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118677504 unmapped: 3211264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118677504 unmapped: 3211264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7643000/0x0/0x4ffc00000, data 0x3f69109/0x4021000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 3178496 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 3178496 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 3178496 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1520512 data_alloc: 234881024 data_used: 23384064
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 4841472 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7648000/0x0/0x4ffc00000, data 0x3f6c109/0x4024000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb179f2780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.464450836s of 10.549235344s, submitted: 131
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb165472c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 4833280 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16a5b800 session 0x55bb1691a5a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 10584064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 10584064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8d60000/0x0/0x4ffc00000, data 0x28560e9/0x290c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 10584064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296941 data_alloc: 234881024 data_used: 16453632
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51c00 session 0x55bb176cc5a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d8400 session 0x55bb1796e780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109182976 unmapped: 12705792 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33c00 session 0x55bb1691a1e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148353 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb147a21e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148353 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148353 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.459398270s of 20.503299713s, submitted: 84
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147901 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 12599296 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb146b1a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31800 session 0x55bb172f70e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108806144 unmapped: 13082624 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108806144 unmapped: 13082624 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108814336 unmapped: 13074432 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb146aa3c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108257280 unmapped: 13631488 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1191849 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f989a000/0x0/0x4ffc00000, data 0x1d1d0c6/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f989a000/0x0/0x4ffc00000, data 0x1d1d0c6/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.978419304s of 12.022029877s, submitted: 57
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1204922 data_alloc: 234881024 data_used: 13922304
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f989a000/0x0/0x4ffc00000, data 0x1d1d0c6/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 114130944 unmapped: 7757824 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276110 data_alloc: 234881024 data_used: 14196736
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8fa4000/0x0/0x4ffc00000, data 0x26050c6/0x26ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 10461184 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284076 data_alloc: 234881024 data_used: 14118912
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.410071373s of 12.487000465s, submitted: 130
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284092 data_alloc: 234881024 data_used: 14118912
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284092 data_alloc: 234881024 data_used: 14118912
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284244 data_alloc: 234881024 data_used: 14123008
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.180794716s of 14.182448387s, submitted: 1
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb172f6960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb13e825a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111165440 unmapped: 10723328 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb146b1c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109461504 unmapped: 12427264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 7997 writes, 31K keys, 7997 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 7997 writes, 1974 syncs, 4.05 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2079 writes, 6139 keys, 2079 commit groups, 1.0 writes per commit group, ingest: 6.38 MB, 0.01 MB/s
Interval WAL: 2079 writes, 934 syncs, 2.23 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memta
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109461504 unmapped: 12427264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109461504 unmapped: 12427264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb17938780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6800 session 0x55bb1691a960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb17938f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb16922000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb147a2960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb163f2f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bfc00 session 0x55bb156f9c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.919490814s of 19.946268082s, submitted: 38
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb176cc780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e36000/0x0/0x4ffc00000, data 0x2782064/0x2836000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266128 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb17925a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 27901952 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118661120 unmapped: 19718144 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118661120 unmapped: 19718144 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118693888 unmapped: 19685376 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118693888 unmapped: 19685376 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1354209 data_alloc: 234881024 data_used: 24870912
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118726656 unmapped: 19652608 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.768112183s of 10.846858978s, submitted: 112
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 19644416 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 19644416 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 19644416 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 19628032 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355737 data_alloc: 234881024 data_used: 24866816
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 123494400 unmapped: 14884864 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f832a000/0x0/0x4ffc00000, data 0x328d087/0x3342000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125542400 unmapped: 12836864 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8268000/0x0/0x4ffc00000, data 0x3346087/0x33fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125575168 unmapped: 12804096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8268000/0x0/0x4ffc00000, data 0x3346087/0x33fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1461459 data_alloc: 234881024 data_used: 25677824
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.644487381s of 10.730033875s, submitted: 144
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124919808 unmapped: 13459456 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f824d000/0x0/0x4ffc00000, data 0x336a087/0x341f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1454048 data_alloc: 234881024 data_used: 25743360
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f824d000/0x0/0x4ffc00000, data 0x336a087/0x341f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 13418496 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 13352960 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1454056 data_alloc: 234881024 data_used: 25743360
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6000 session 0x55bb17939c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51400 session 0x55bb172f6000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125575168 unmapped: 12804096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7854000/0x0/0x4ffc00000, data 0x3d620e9/0x3e18000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0800 session 0x55bb17999a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0800 session 0x55bb17924960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7854000/0x0/0x4ffc00000, data 0x3d620e9/0x3e18000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb16395a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.622238159s of 10.764037132s, submitted: 213
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6000 session 0x55bb1796f0e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 12713984 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 12689408 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133931008 unmapped: 4448256 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1600375 data_alloc: 251658240 data_used: 34828288
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133931008 unmapped: 4448256 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133931008 unmapped: 4448256 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7850000/0x0/0x4ffc00000, data 0x3d6510c/0x3e1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b8800 session 0x55bb17998f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1600751 data_alloc: 251658240 data_used: 34832384
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7850000/0x0/0x4ffc00000, data 0x3d6510c/0x3e1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.880483627s of 10.891222000s, submitted: 13
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137666560 unmapped: 5226496 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f68c8000/0x0/0x4ffc00000, data 0x4ce710c/0x4d9e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138551296 unmapped: 4341760 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1734917 data_alloc: 251658240 data_used: 36098048
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138633216 unmapped: 4259840 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f000 session 0x55bb1796ef00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138633216 unmapped: 4259840 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f682c000/0x0/0x4ffc00000, data 0x4d7a10c/0x4e31000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 4947968 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 4939776 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 4939776 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1728837 data_alloc: 251658240 data_used: 36098048
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 4964352 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f681b000/0x0/0x4ffc00000, data 0x4d9a10c/0x4e51000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 4964352 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb16394b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51400 session 0x55bb176cd2c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130326528 unmapped: 12566528 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839800 session 0x55bb149b94a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 12525568 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f823e000/0x0/0x4ffc00000, data 0x3378087/0x342d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 12525568 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1470835 data_alloc: 234881024 data_used: 25735168
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 12443648 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d400 session 0x55bb1796e5a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.900511742s of 13.028066635s, submitted: 225
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0400 session 0x55bb176cd0e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120381440 unmapped: 22511616 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120381440 unmapped: 22511616 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120381440 unmapped: 22511616 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 22462464 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193777 data_alloc: 234881024 data_used: 12009472
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 22462464 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 22462464 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195289 data_alloc: 234881024 data_used: 12009472
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.551191330s of 11.578843117s, submitted: 39
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195141 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120479744 unmapped: 22413312 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195009 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c2400 session 0x55bb17924b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad9400 session 0x55bb16526780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb163f3e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51000 session 0x55bb179d9a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb146ad680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb17924d20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f000 session 0x55bb179990e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c2400 session 0x55bb179d85a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad9400 session 0x55bb17360960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51000 session 0x55bb17924f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.872282982s of 10.916283607s, submitted: 41
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb179241e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899e000/0x0/0x4ffc00000, data 0x2c190c6/0x2cce000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121348096 unmapped: 38412288 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1332015 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1459543 data_alloc: 251658240 data_used: 30892032
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129761280 unmapped: 29999104 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129794048 unmapped: 29966336 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129802240 unmapped: 29958144 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129802240 unmapped: 29958144 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.577187538s of 10.583614349s, submitted: 8
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [2])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137314304 unmapped: 22446080 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1542929 data_alloc: 251658240 data_used: 31178752
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137576448 unmapped: 22183936 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e31000/0x0/0x4ffc00000, data 0x37850d6/0x383b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e31000/0x0/0x4ffc00000, data 0x37850d6/0x383b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1556209 data_alloc: 251658240 data_used: 32387072
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e12000/0x0/0x4ffc00000, data 0x37a40d6/0x385a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1555001 data_alloc: 251658240 data_used: 32391168
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e12000/0x0/0x4ffc00000, data 0x37a40d6/0x385a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0400 session 0x55bb17995e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.999485970s of 12.082794189s, submitted: 137
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139108352 unmapped: 20652032 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139108352 unmapped: 20652032 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139108352 unmapped: 20652032 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e05000/0x0/0x4ffc00000, data 0x37b10d6/0x3867000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1555305 data_alloc: 251658240 data_used: 32391168
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e05000/0x0/0x4ffc00000, data 0x37b10d6/0x3867000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1556037 data_alloc: 251658240 data_used: 32399360
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e00000/0x0/0x4ffc00000, data 0x37b60d6/0x386c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.517280579s of 10.523816109s, submitted: 7
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139239424 unmapped: 20520960 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb17a1c400 session 0x55bb179952c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4c00 session 0x55bb179d9680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb163f3680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7df2000/0x0/0x4ffc00000, data 0x37c40d6/0x387a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb13e82f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0400 session 0x55bb13e82960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4c00 session 0x55bb179250e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb17a1c400 session 0x55bb179254a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4800 session 0x55bb176cd860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4800 session 0x55bb163985a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139501568 unmapped: 20258816 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139501568 unmapped: 20258816 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1622344 data_alloc: 251658240 data_used: 32395264
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7592000/0x0/0x4ffc00000, data 0x4022148/0x40da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139616256 unmapped: 20144128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139616256 unmapped: 20144128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be800 session 0x55bb165270e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c1400 session 0x55bb176cc780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139616256 unmapped: 20144128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bac00 session 0x55bb17925a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33c00 session 0x55bb176ccf00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139632640 unmapped: 20127744 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144302080 unmapped: 15458304 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1668826 data_alloc: 251658240 data_used: 37187584
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7591000/0x0/0x4ffc00000, data 0x402216b/0x40db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f758f000/0x0/0x4ffc00000, data 0x402316b/0x40dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.890444756s of 11.935307503s, submitted: 51
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1677218 data_alloc: 251658240 data_used: 37236736
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144441344 unmapped: 15319040 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f758c000/0x0/0x4ffc00000, data 0x402716b/0x40e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147808256 unmapped: 11952128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1772648 data_alloc: 251658240 data_used: 37298176
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f6573000/0x0/0x4ffc00000, data 0x4c2f16b/0x4ce8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.307511330s of 10.372504234s, submitted: 103
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147324928 unmapped: 12435456 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1768056 data_alloc: 251658240 data_used: 37302272
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147324928 unmapped: 12435456 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f6572000/0x0/0x4ffc00000, data 0x4c3116b/0x4cea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1768432 data_alloc: 251658240 data_used: 37302272
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f656d000/0x0/0x4ffc00000, data 0x4c3616b/0x4cef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f656c000/0x0/0x4ffc00000, data 0x4c3716b/0x4cf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147365888 unmapped: 12394496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33c00 session 0x55bb149b8960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bac00 session 0x55bb146ac780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f6569000/0x0/0x4ffc00000, data 0x4c3a16b/0x4cf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bf800 session 0x55bb16922f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1570754 data_alloc: 251658240 data_used: 29581312
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f79ce000/0x0/0x4ffc00000, data 0x37d60d6/0x388c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f79ce000/0x0/0x4ffc00000, data 0x37d60d6/0x388c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.750416756s of 13.788337708s, submitted: 53
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e800 session 0x55bb16398d20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f000 session 0x55bb13e83a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f79cd000/0x0/0x4ffc00000, data 0x37d90d6/0x388f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1570022 data_alloc: 251658240 data_used: 29581312
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bac00 session 0x55bb179385a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224037 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224037 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224037 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.484128952s of 19.514802933s, submitted: 43
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd800 session 0x55bb1a07d860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282737 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9187000/0x0/0x4ffc00000, data 0x2021064/0x20d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16a5a800 session 0x55bb152c1860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128245760 unmapped: 31514624 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9163000/0x0/0x4ffc00000, data 0x2045064/0x20f9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319037 data_alloc: 234881024 data_used: 17121280
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb17995680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9c00 session 0x55bb179d83c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 29917184 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb17994f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231281 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231281 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.315660477s of 20.341112137s, submitted: 34
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f800 session 0x55bb146ab2c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b9400 session 0x55bb17360b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f924f000/0x0/0x4ffc00000, data 0x1f580c6/0x200d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280686 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c9d800 session 0x55bb176ccf00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127565824 unmapped: 32194560 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f924e000/0x0/0x4ffc00000, data 0x1f580e9/0x200e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 30883840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320451 data_alloc: 234881024 data_used: 17633280
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 30883840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b8c00 session 0x55bb176cd860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb179383c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1238459 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1238459 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f000 session 0x55bb179f3680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1238459 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1783f800 session 0x55bb179f32c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30400 session 0x55bb179f3860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51400 session 0x55bb179f23c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb179f3e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.602361679s of 26.647577286s, submitted: 61
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f000 session 0x55bb179f2000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30400 session 0x55bb13e82960
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1783f800 session 0x55bb13e830e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e000 session 0x55bb146aad20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb163f3a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352110 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16a5ac00 session 0x55bb179994a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a49000/0x0/0x4ffc00000, data 0x275d0d6/0x2813000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5800 session 0x55bb147a21e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb16527860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31000 session 0x55bb17939c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128393216 unmapped: 34521088 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128393216 unmapped: 34521088 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355610 data_alloc: 234881024 data_used: 12021760
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [2])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442402 data_alloc: 234881024 data_used: 24817664
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.114472389s of 14.163391113s, submitted: 61
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 140820480 unmapped: 22093824 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 140992512 unmapped: 21921792 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532334 data_alloc: 234881024 data_used: 25493504
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141041664 unmapped: 21872640 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141041664 unmapped: 21872640 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141041664 unmapped: 21872640 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7ece000/0x0/0x4ffc00000, data 0x32d0109/0x3388000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1534478 data_alloc: 234881024 data_used: 25710592
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7ed4000/0x0/0x4ffc00000, data 0x32d0109/0x3388000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1531874 data_alloc: 234881024 data_used: 25718784
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.299962044s of 13.376144409s, submitted: 148
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb163f2f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5800 session 0x55bb176cc5a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e400 session 0x55bb17998b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256161 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256161 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256161 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb17995860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5400 session 0x55bb1a07d4a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb179254a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb163943c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.407205582s of 15.435142517s, submitted: 51
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5800 session 0x55bb163f3680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e400 session 0x55bb16863a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c9cc00 session 0x55bb179954a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb152c1860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb13e82780
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f92a1000/0x0/0x4ffc00000, data 0x1f07064/0x1fbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299362 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb1691be00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f927d000/0x0/0x4ffc00000, data 0x1f2b064/0x1fdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1340147 data_alloc: 234881024 data_used: 17301504
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f927d000/0x0/0x4ffc00000, data 0x1f2b064/0x1fdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1340147 data_alloc: 234881024 data_used: 17301504
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb179994a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bb000 session 0x55bb17998b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bb000 session 0x55bb17998f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb179f3860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.089474678s of 13.115797043s, submitted: 25
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb179f3e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb152c1e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb156f81e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb164452c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb179d8d20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132636672 unmapped: 30277632 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f927d000/0x0/0x4ffc00000, data 0x1f2b064/0x1fdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136314880 unmapped: 26599424 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb176cd860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134701056 unmapped: 28213248 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bb000 session 0x55bb16445680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb16547680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb179f2b40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8303000/0x0/0x4ffc00000, data 0x2ea30d6/0x2f59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134709248 unmapped: 28205056 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8301000/0x0/0x4ffc00000, data 0x2ea3109/0x2f5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134717440 unmapped: 28196864 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1477652 data_alloc: 234881024 data_used: 17694720
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134217728 unmapped: 28696576 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134217728 unmapped: 28696576 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134217728 unmapped: 28696576 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 28688384 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 28688384 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1506156 data_alloc: 234881024 data_used: 21958656
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82ff000/0x0/0x4ffc00000, data 0x2ea5109/0x2f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82ff000/0x0/0x4ffc00000, data 0x2ea5109/0x2f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.724095345s of 13.820690155s, submitted: 143
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 140476416 unmapped: 22437888 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1613132 data_alloc: 234881024 data_used: 22204416
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141090816 unmapped: 21823488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624194 data_alloc: 234881024 data_used: 22216704
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141090816 unmapped: 21823488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141090816 unmapped: 21823488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141123584 unmapped: 21790720 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141123584 unmapped: 21790720 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141123584 unmapped: 21790720 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624210 data_alloc: 234881024 data_used: 22216704
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141131776 unmapped: 21782528 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141131776 unmapped: 21782528 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141139968 unmapped: 21774336 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141139968 unmapped: 21774336 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141139968 unmapped: 21774336 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624210 data_alloc: 234881024 data_used: 22216704
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.310153961s of 16.393232346s, submitted: 138
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb179f21e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb179943c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141164544 unmapped: 21749760 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb179f2d20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7354000/0x0/0x4ffc00000, data 0x2969064/0x2a1d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1441971 data_alloc: 234881024 data_used: 17686528
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b9800 session 0x55bb1a07d860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bf400 session 0x55bb1796f860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb176cc1e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f80ad000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286343 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f80ad000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286343 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30400 session 0x55bb155605a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be800 session 0x55bb17995860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb156dd800 session 0x55bb155601e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f000 session 0x55bb16399e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f80ad000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.128541946s of 19.187244415s, submitted: 97
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb14672800 session 0x55bb163f23c0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb14a98c00 session 0x55bb1a1145a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad9c00 session 0x55bb17938f00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136527872 unmapped: 33210368 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386118 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d800 session 0x55bb16398d20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d800 session 0x55bb1a07c000
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f793e000/0x0/0x4ffc00000, data 0x26c9074/0x277e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f793e000/0x0/0x4ffc00000, data 0x26c9074/0x277e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f793e000/0x0/0x4ffc00000, data 0x26c9074/0x277e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1385894 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31400 session 0x55bb1691b860
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bc000 session 0x55bb173614a0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6400 session 0x55bb179f3a40
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb16923e00
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136814592 unmapped: 32923648 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1483305 data_alloc: 234881024 data_used: 25530368
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1483305 data_alloc: 234881024 data_used: 25530368
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.360231400s of 16.395832062s, submitted: 35
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 145932288 unmapped: 23805952 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146391040 unmapped: 23347200 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146391040 unmapped: 23347200 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1553735 data_alloc: 234881024 data_used: 25776128
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f5fd5000/0x0/0x4ffc00000, data 0x2e72084/0x2f28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1553735 data_alloc: 234881024 data_used: 25776128
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread fragmentation_score=0.000363 took=0.000023s
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f5fd5000/0x0/0x4ffc00000, data 0x2e72084/0x2f28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1553735 data_alloc: 234881024 data_used: 25776128
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.504435539s of 14.553052902s, submitted: 89
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb176cd0e0
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d800 session 0x55bb17925680
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146464768 unmapped: 23273472 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e400 session 0x55bb147a3c20
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 31711232 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 31711232 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138067968 unmapped: 31670272 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138076160 unmapped: 31662080 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'config diff' '{prefix=config diff}'
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'config show' '{prefix=config show}'
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'counter dump' '{prefix=counter dump}'
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137732096 unmapped: 32006144 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'counter schema' '{prefix=counter schema}'
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 32415744 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:01:35 np0005591762 ceph-osd[77912]: do_command 'log dump' '{prefix=log dump}'
Jan 22 05:01:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:35.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:35 np0005591762 nova_compute[225313]: 2026-01-22 10:01:35.338 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3425818797' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3964776681' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 22 05:01:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3553518347' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 22 05:01:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/749701509' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 22 05:01:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 22 05:01:36 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3450772711' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 22 05:01:36 np0005591762 nova_compute[225313]: 2026-01-22 10:01:36.667 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 22 05:01:36 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3513102080' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1077777603' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 22 05:01:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:37.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3547016935' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/303413048' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 22 05:01:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1563924601' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/384014986' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:01:37 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4099818546' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3606805641' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 22 05:01:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:38.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2772059257' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/429786084' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 22 05:01:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/121172981' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:01:38 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:01:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 22 05:01:39 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3521829757' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 22 05:01:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 22 05:01:39 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4131391769' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 22 05:01:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:39.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:39 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 22 05:01:39 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2075691075' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 22 05:01:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:39 np0005591762 systemd[1]: Starting Hostname Service...
Jan 22 05:01:39 np0005591762 systemd[1]: Started Hostname Service.
Jan 22 05:01:40 np0005591762 nova_compute[225313]: 2026-01-22 10:01:40.339 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:40.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:40 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 22 05:01:40 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2633622080' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 22 05:01:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:40 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 22 05:01:40 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2017689314' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 22 05:01:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:41.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 22 05:01:41 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1905000279' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 22 05:01:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 22 05:01:41 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1970227719' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 22 05:01:41 np0005591762 nova_compute[225313]: 2026-01-22 10:01:41.668 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2513604521' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 22 05:01:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:42.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 22 05:01:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 22 05:01:42 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/872554858' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 22 05:01:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:43.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:43 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 22 05:01:43 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4147138334' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 22 05:01:43 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 22 05:01:43 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1303062398' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 22 05:01:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:43 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 22 05:01:43 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2730071197' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 22 05:01:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:01:44 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:01:44 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 22 05:01:44 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3933713589' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 22 05:01:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:44.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:44 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 22 05:01:44 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3547347452' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 22 05:01:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:45 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 22 05:01:45 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1614385348' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 22 05:01:45 np0005591762 nova_compute[225313]: 2026-01-22 10:01:45.340 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:45.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:45 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 22 05:01:45 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1101024046' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 22 05:01:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:45 np0005591762 podman[238850]: 2026-01-22 10:01:45.824064158 +0000 UTC m=+0.047444372 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/120828417' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 22 05:01:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:46.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:46 np0005591762 nova_compute[225313]: 2026-01-22 10:01:46.668 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 22 05:01:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 22 05:01:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:01:47.205 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:01:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:01:47.205 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:01:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:01:47.205 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:01:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 22 05:01:47 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/911480340' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 22 05:01:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:47.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:48 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 22 05:01:48 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1784093012' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 22 05:01:48 np0005591762 ovs-appctl[239985]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 05:01:48 np0005591762 ovs-appctl[239993]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 05:01:48 np0005591762 ovs-appctl[240002]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 05:01:48 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 22 05:01:48 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1347363234' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 22 05:01:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:48.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:49.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:50 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 22 05:01:50 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2007539496' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 22 05:01:50 np0005591762 nova_compute[225313]: 2026-01-22 10:01:50.341 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:50 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 22 05:01:50 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2833480809' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 22 05:01:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:50.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:50 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 22 05:01:50 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1555913216' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 22 05:01:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:51.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Jan 22 05:01:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/340673463' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 22 05:01:51 np0005591762 nova_compute[225313]: 2026-01-22 10:01:51.670 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 22 05:01:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/150652646' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 22 05:01:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:52.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 22 05:01:52 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1200031031' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 22 05:01:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:53.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:53 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 22 05:01:53 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/137589296' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 22 05:01:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:53 np0005591762 podman[241671]: 2026-01-22 10:01:53.812478662 +0000 UTC m=+0.116433957 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 05:01:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:54.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:54 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 22 05:01:54 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/288908141' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 22 05:01:54 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 22 05:01:54 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3485189337' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 22 05:01:54 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 22 05:01:54 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1188161649' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 22 05:01:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:55 np0005591762 nova_compute[225313]: 2026-01-22 10:01:55.342 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:55.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:55 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 22 05:01:55 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/916363926' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 22 05:01:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:55 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 22 05:01:55 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1483851601' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 22 05:01:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:56.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:56 np0005591762 nova_compute[225313]: 2026-01-22 10:01:56.672 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:01:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 22 05:01:56 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/82018234' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 22 05:01:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:01:57 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 05:01:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 22 05:01:57 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2292640990' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 22 05:01:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:57.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:57 np0005591762 systemd[1]: Starting Time & Date Service...
Jan 22 05:01:57 np0005591762 systemd[1]: Started Time & Date Service.
Jan 22 05:01:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:01:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:01:58.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:01:58 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 22 05:01:58 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4263213212' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 22 05:01:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:01:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:01:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:01:59.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:01:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:01:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:01:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:01:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:00 np0005591762 nova_compute[225313]: 2026-01-22 10:02:00.343 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:02:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:02:00.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:00 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 22 05:02:00 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1710758159' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 22 05:02:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:02:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:02:01.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:02:01 np0005591762 nova_compute[225313]: 2026-01-22 10:02:01.674 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:02:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:01 np0005591762 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 22 05:02:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:02:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:02:02.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:02:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:02:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:02:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:02:04.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:05 np0005591762 nova_compute[225313]: 2026-01-22 10:02:05.344 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:02:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:02:05.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:02:06.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:06 np0005591762 nova_compute[225313]: 2026-01-22 10:02:06.675 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:02:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:02:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:02:07.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:02:08.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:02:09.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:59 np0005591762 rsyslogd[963]: imjournal: 453 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 22 05:02:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:02:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:02:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:02:59.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:02:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:02:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:02:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:02:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:00 np0005591762 nova_compute[225313]: 2026-01-22 10:03:00.372 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:00.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:01.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:01 np0005591762 nova_compute[225313]: 2026-01-22 10:03:01.694 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:02.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:03.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:03:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:04.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:03:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:05 np0005591762 nova_compute[225313]: 2026-01-22 10:03:05.374 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:05.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:06.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:06 np0005591762 nova_compute[225313]: 2026-01-22 10:03:06.696 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:07.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:08.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:09.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:10 np0005591762 nova_compute[225313]: 2026-01-22 10:03:10.376 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:10.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:10 np0005591762 nova_compute[225313]: 2026-01-22 10:03:10.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:10 np0005591762 nova_compute[225313]: 2026-01-22 10:03:10.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 05:03:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:11.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:11 np0005591762 nova_compute[225313]: 2026-01-22 10:03:11.699 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:12.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:13.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:14.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:15 np0005591762 nova_compute[225313]: 2026-01-22 10:03:15.376 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:15.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:16.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:16 np0005591762 nova_compute[225313]: 2026-01-22 10:03:16.701 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:16 np0005591762 nova_compute[225313]: 2026-01-22 10:03:16.733 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:17.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:17 np0005591762 nova_compute[225313]: 2026-01-22 10:03:17.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:17 np0005591762 nova_compute[225313]: 2026-01-22 10:03:17.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:03:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:18.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:18 np0005591762 nova_compute[225313]: 2026-01-22 10:03:18.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:18 np0005591762 nova_compute[225313]: 2026-01-22 10:03:18.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:03:18 np0005591762 nova_compute[225313]: 2026-01-22 10:03:18.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:03:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:18 np0005591762 nova_compute[225313]: 2026-01-22 10:03:18.735 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:03:18 np0005591762 nova_compute[225313]: 2026-01-22 10:03:18.735 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:18 np0005591762 podman[242978]: 2026-01-22 10:03:18.817945982 +0000 UTC m=+0.041162576 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 05:03:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:19.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:19 np0005591762 nova_compute[225313]: 2026-01-22 10:03:19.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:19 np0005591762 nova_compute[225313]: 2026-01-22 10:03:19.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:20 np0005591762 nova_compute[225313]: 2026-01-22 10:03:20.378 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:20.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:03:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:21.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:03:21 np0005591762 nova_compute[225313]: 2026-01-22 10:03:21.703 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:21 np0005591762 nova_compute[225313]: 2026-01-22 10:03:21.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:21 np0005591762 nova_compute[225313]: 2026-01-22 10:03:21.724 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 05:03:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:21 np0005591762 nova_compute[225313]: 2026-01-22 10:03:21.741 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 05:03:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:22.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:22 np0005591762 nova_compute[225313]: 2026-01-22 10:03:22.741 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:23.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:23 np0005591762 nova_compute[225313]: 2026-01-22 10:03:23.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:24.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.739 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.739 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.768 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.769 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.769 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.769 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:03:24 np0005591762 nova_compute[225313]: 2026-01-22 10:03:24.769 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:03:25 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:03:25 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/609864197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.107 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.313 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.314 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4844MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.314 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.315 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.379 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:25.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.463 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.463 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.544 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing inventories for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.626 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating ProviderTree inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.627 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.638 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing aggregate associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.656 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing trait associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, traits: HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX512VAES,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AESNI,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 05:03:25 np0005591762 nova_compute[225313]: 2026-01-22 10:03:25.672 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:03:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:25 np0005591762 podman[243034]: 2026-01-22 10:03:25.844881929 +0000 UTC m=+0.070546985 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 05:03:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:03:26 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/844724641' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:03:26 np0005591762 nova_compute[225313]: 2026-01-22 10:03:26.026 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:03:26 np0005591762 nova_compute[225313]: 2026-01-22 10:03:26.030 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:03:26 np0005591762 nova_compute[225313]: 2026-01-22 10:03:26.045 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:03:26 np0005591762 nova_compute[225313]: 2026-01-22 10:03:26.046 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:03:26 np0005591762 nova_compute[225313]: 2026-01-22 10:03:26.047 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:03:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:26.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:26 np0005591762 nova_compute[225313]: 2026-01-22 10:03:26.704 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:27.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:28.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:29.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:30 np0005591762 nova_compute[225313]: 2026-01-22 10:03:30.381 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:30.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:31.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:31 np0005591762 nova_compute[225313]: 2026-01-22 10:03:31.707 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:32.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:33.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:34.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:35 np0005591762 nova_compute[225313]: 2026-01-22 10:03:35.382 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:35.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:36.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:36 np0005591762 nova_compute[225313]: 2026-01-22 10:03:36.708 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:37.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:38.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:39.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:40 np0005591762 nova_compute[225313]: 2026-01-22 10:03:40.384 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:40.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:41.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:41 np0005591762 nova_compute[225313]: 2026-01-22 10:03:41.710 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:42.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:43.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:44.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:45 np0005591762 nova_compute[225313]: 2026-01-22 10:03:45.385 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:45.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:46.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:46 np0005591762 nova_compute[225313]: 2026-01-22 10:03:46.712 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:03:47.207 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:03:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:03:47.207 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:03:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:03:47.207 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:03:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:47.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:48.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:49.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:49 np0005591762 podman[243196]: 2026-01-22 10:03:49.816931709 +0000 UTC m=+0.040171859 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:49 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:03:50 np0005591762 nova_compute[225313]: 2026-01-22 10:03:50.387 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:50.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/361757237' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/361757237' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:03:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:51.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:51 np0005591762 nova_compute[225313]: 2026-01-22 10:03:51.714 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5684 writes, 29K keys, 5684 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5684 writes, 5684 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1584 writes, 7917 keys, 1584 commit groups, 1.0 writes per commit group, ingest: 17.80 MB, 0.03 MB/s#012Interval WAL: 1584 writes, 1584 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    444.8      0.10              0.07        15    0.007       0      0       0.0       0.0#012  L6      1/0   13.47 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.0    513.7    436.9      0.42              0.29        14    0.030     74K   7499       0.0       0.0#012 Sum      1/0   13.47 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.0    412.4    438.5      0.52              0.36        29    0.018     74K   7499       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.8    397.9    405.4      0.20              0.14        10    0.020     31K   2592       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    513.7    436.9      0.42              0.29        14    0.030     74K   7499       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    450.0      0.10              0.07        14    0.007       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.044, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.22 GB write, 0.13 MB/s write, 0.21 GB read, 0.12 MB/s read, 0.5 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a025f49350#2 capacity: 304.00 MB usage: 17.47 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000152 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1141,16.87 MB,5.54968%) FilterBlock(29,216.23 KB,0.0694626%) IndexBlock(29,393.92 KB,0.126543%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 22 05:03:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:52.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:53.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:03:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:55 np0005591762 nova_compute[225313]: 2026-01-22 10:03:55.389 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:55.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:56.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:56 np0005591762 nova_compute[225313]: 2026-01-22 10:03:56.714 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:03:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:56 np0005591762 podman[243269]: 2026-01-22 10:03:56.830242046 +0000 UTC m=+0.054282946 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 22 05:03:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:03:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:57.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:03:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:03:58.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:03:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:03:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:03:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:03:59.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:03:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:03:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:03:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:03:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:00 np0005591762 nova_compute[225313]: 2026-01-22 10:04:00.391 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:00.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:01.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:01 np0005591762 nova_compute[225313]: 2026-01-22 10:04:01.715 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:02.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:03.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:04.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:05 np0005591762 nova_compute[225313]: 2026-01-22 10:04:05.393 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:05.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:06.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:06 np0005591762 nova_compute[225313]: 2026-01-22 10:04:06.716 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:07.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:08.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:09.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:10 np0005591762 nova_compute[225313]: 2026-01-22 10:04:10.395 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:10.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:11.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:11 np0005591762 nova_compute[225313]: 2026-01-22 10:04:11.718 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:12.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:13.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:14.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:15 np0005591762 nova_compute[225313]: 2026-01-22 10:04:15.397 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:16.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:16 np0005591762 nova_compute[225313]: 2026-01-22 10:04:16.720 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:04:16 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 11K writes, 44K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 11K writes, 3600 syncs, 3.26 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3744 writes, 13K keys, 3744 commit groups, 1.0 writes per commit group, ingest: 15.38 MB, 0.03 MB/s#012Interval WAL: 3744 writes, 1626 syncs, 2.30 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 22 05:04:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.026 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.027 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.046 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.047 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:04:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:18.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:04:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:18 np0005591762 nova_compute[225313]: 2026-01-22 10:04:18.737 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:04:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:19.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:19 np0005591762 nova_compute[225313]: 2026-01-22 10:04:19.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:19 np0005591762 nova_compute[225313]: 2026-01-22 10:04:19.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:20 np0005591762 nova_compute[225313]: 2026-01-22 10:04:20.400 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:20.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:20 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:04:20 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/675895157' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:04:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:20 np0005591762 nova_compute[225313]: 2026-01-22 10:04:20.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:20 np0005591762 podman[243341]: 2026-01-22 10:04:20.813037542 +0000 UTC m=+0.036921710 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 05:04:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:21.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:21 np0005591762 nova_compute[225313]: 2026-01-22 10:04:21.723 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:22.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:23.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:23 np0005591762 nova_compute[225313]: 2026-01-22 10:04:23.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:24.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:24 np0005591762 nova_compute[225313]: 2026-01-22 10:04:24.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:25 np0005591762 nova_compute[225313]: 2026-01-22 10:04:25.402 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:25.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:04:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:26.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:04:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.724 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.738 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.739 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:04:26 np0005591762 nova_compute[225313]: 2026-01-22 10:04:26.739 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:04:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:04:27 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/458429823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.095 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.356s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.288 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.289 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4846MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.290 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.290 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.360 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.360 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.382 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:04:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:27.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:04:27 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4238124704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:04:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.724 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.728 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:04:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.738 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.739 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:04:27 np0005591762 nova_compute[225313]: 2026-01-22 10:04:27.740 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:04:27 np0005591762 podman[243409]: 2026-01-22 10:04:27.830354152 +0000 UTC m=+0.053490654 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 05:04:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:28.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:29.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:30 np0005591762 nova_compute[225313]: 2026-01-22 10:04:30.403 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:30.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:31.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:31 np0005591762 nova_compute[225313]: 2026-01-22 10:04:31.725 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:32.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:04:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:34.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:04:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:35 np0005591762 nova_compute[225313]: 2026-01-22 10:04:35.404 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:36.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:36 np0005591762 nova_compute[225313]: 2026-01-22 10:04:36.727 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:37.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.558398) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076278558465, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1576, "num_deletes": 256, "total_data_size": 3986949, "memory_usage": 4050400, "flush_reason": "Manual Compaction"}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076278564326, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2586186, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29054, "largest_seqno": 30625, "table_properties": {"data_size": 2579630, "index_size": 3691, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13549, "raw_average_key_size": 19, "raw_value_size": 2566380, "raw_average_value_size": 3655, "num_data_blocks": 163, "num_entries": 702, "num_filter_entries": 702, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769076145, "oldest_key_time": 1769076145, "file_creation_time": 1769076278, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 5927 microseconds, and 4132 cpu microseconds.
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:04:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.564357) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2586186 bytes OK
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.564373) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.564832) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.564845) EVENT_LOG_v1 {"time_micros": 1769076278564842, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.564857) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3979693, prev total WAL file size 3979693, number of live WAL files 2.
Jan 22 05:04:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:38.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.565602) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353030' seq:72057594037927935, type:22 .. '6C6F676D00373532' seq:0, type:0; will stop at (end)
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2525KB)], [54(13MB)]
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076278565623, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16706171, "oldest_snapshot_seqno": -1}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6499 keys, 16565503 bytes, temperature: kUnknown
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076278601932, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16565503, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16520189, "index_size": 27988, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16261, "raw_key_size": 166135, "raw_average_key_size": 25, "raw_value_size": 16401232, "raw_average_value_size": 2523, "num_data_blocks": 1141, "num_entries": 6499, "num_filter_entries": 6499, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769076278, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.602149) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16565503 bytes
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.602571) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 459.3 rd, 455.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 13.5 +0.0 blob) out(15.8 +0.0 blob), read-write-amplify(12.9) write-amplify(6.4) OK, records in: 7027, records dropped: 528 output_compression: NoCompression
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.602589) EVENT_LOG_v1 {"time_micros": 1769076278602579, "job": 32, "event": "compaction_finished", "compaction_time_micros": 36371, "compaction_time_cpu_micros": 23025, "output_level": 6, "num_output_files": 1, "total_output_size": 16565503, "num_input_records": 7027, "num_output_records": 6499, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076278602944, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076278604744, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.565556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.604771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.604775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.604776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.604777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:38 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:38.604778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:39.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:40 np0005591762 nova_compute[225313]: 2026-01-22 10:04:40.405 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:40.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:41.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:41 np0005591762 nova_compute[225313]: 2026-01-22 10:04:41.729 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:42.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:43.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:44.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:45 np0005591762 nova_compute[225313]: 2026-01-22 10:04:45.408 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:45.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:46.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:46 np0005591762 nova_compute[225313]: 2026-01-22 10:04:46.730 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:04:47.208 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:04:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:04:47.208 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:04:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:04:47.208 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:04:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:47.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:48.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:49.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:50 np0005591762 nova_compute[225313]: 2026-01-22 10:04:50.408 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:50.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:04:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1063796805' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:04:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:04:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1063796805' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:04:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:51.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:51 np0005591762 nova_compute[225313]: 2026-01-22 10:04:51.732 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:51 np0005591762 podman[243482]: 2026-01-22 10:04:51.815238045 +0000 UTC m=+0.038811003 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 05:04:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:52.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.072680819 +0000 UTC m=+0.029131922 container create cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 22 05:04:54 np0005591762 systemd[1]: Started libpod-conmon-cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3.scope.
Jan 22 05:04:54 np0005591762 systemd[1]: Started libcrun container.
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.132195699 +0000 UTC m=+0.088646813 container init cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_hofstadter, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.137281208 +0000 UTC m=+0.093732312 container start cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_hofstadter, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.138488083 +0000 UTC m=+0.094939187 container attach cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 22 05:04:54 np0005591762 recursing_hofstadter[243674]: 167 167
Jan 22 05:04:54 np0005591762 systemd[1]: libpod-cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3.scope: Deactivated successfully.
Jan 22 05:04:54 np0005591762 conmon[243674]: conmon cf7d7174a94f479f048c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3.scope/container/memory.events
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.142592743 +0000 UTC m=+0.099043838 container died cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_hofstadter, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 22 05:04:54 np0005591762 systemd[1]: var-lib-containers-storage-overlay-9d8187d4bec1164fa8b8ad96eafde772a121c7936fe8ce03d084e84f553253d6-merged.mount: Deactivated successfully.
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.061318614 +0000 UTC m=+0.017769737 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 05:04:54 np0005591762 podman[243661]: 2026-01-22 10:04:54.165010755 +0000 UTC m=+0.121461858 container remove cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_hofstadter, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 22 05:04:54 np0005591762 systemd[1]: libpod-conmon-cf7d7174a94f479f048c963d1e5c2f7002091060cb5ae04892b8a4dc9bdbafa3.scope: Deactivated successfully.
Jan 22 05:04:54 np0005591762 podman[243721]: 2026-01-22 10:04:54.285054438 +0000 UTC m=+0.033003804 container create 0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_shaw, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 05:04:54 np0005591762 systemd[1]: Started libpod-conmon-0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734.scope.
Jan 22 05:04:54 np0005591762 systemd[1]: Started libcrun container.
Jan 22 05:04:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57683bcdbba38db308ff4b875aec55fe82f240b6cfb33493a655fcf110443e1d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 22 05:04:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57683bcdbba38db308ff4b875aec55fe82f240b6cfb33493a655fcf110443e1d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 22 05:04:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57683bcdbba38db308ff4b875aec55fe82f240b6cfb33493a655fcf110443e1d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 22 05:04:54 np0005591762 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57683bcdbba38db308ff4b875aec55fe82f240b6cfb33493a655fcf110443e1d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 22 05:04:54 np0005591762 podman[243721]: 2026-01-22 10:04:54.340622857 +0000 UTC m=+0.088572242 container init 0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_shaw, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 22 05:04:54 np0005591762 podman[243721]: 2026-01-22 10:04:54.346635244 +0000 UTC m=+0.094584600 container start 0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_shaw, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 22 05:04:54 np0005591762 podman[243721]: 2026-01-22 10:04:54.347807354 +0000 UTC m=+0.095756719 container attach 0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_shaw, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 22 05:04:54 np0005591762 podman[243721]: 2026-01-22 10:04:54.273288703 +0000 UTC m=+0.021238078 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 22 05:04:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:54 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:54 np0005591762 practical_shaw[243734]: [
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:    {
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "available": false,
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "being_replaced": false,
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "ceph_device_lvm": false,
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "lsm_data": {},
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "lvs": [],
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "path": "/dev/sr0",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "rejected_reasons": [
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "Has a FileSystem",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "Insufficient space (<5GB)"
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        ],
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        "sys_api": {
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "actuators": null,
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "device_nodes": [
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:                "sr0"
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            ],
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "devname": "sr0",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "human_readable_size": "474.00 KB",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "id_bus": "ata",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "model": "QEMU DVD-ROM",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "nr_requests": "64",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "parent": "/dev/sr0",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "partitions": {},
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "path": "/dev/sr0",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "removable": "1",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "rev": "2.5+",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "ro": "0",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "rotational": "1",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "sas_address": "",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "sas_device_handle": "",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "scheduler_mode": "mq-deadline",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "sectors": 0,
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "sectorsize": "2048",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "size": 485376.0,
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "support_discard": "2048",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "type": "disk",
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:            "vendor": "QEMU"
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:        }
Jan 22 05:04:54 np0005591762 practical_shaw[243734]:    }
Jan 22 05:04:54 np0005591762 practical_shaw[243734]: ]
Jan 22 05:04:54 np0005591762 systemd[1]: libpod-0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734.scope: Deactivated successfully.
Jan 22 05:04:54 np0005591762 podman[245089]: 2026-01-22 10:04:54.927369792 +0000 UTC m=+0.018862889 container died 0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_shaw, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 22 05:04:54 np0005591762 systemd[1]: var-lib-containers-storage-overlay-57683bcdbba38db308ff4b875aec55fe82f240b6cfb33493a655fcf110443e1d-merged.mount: Deactivated successfully.
Jan 22 05:04:54 np0005591762 podman[245089]: 2026-01-22 10:04:54.948775845 +0000 UTC m=+0.040268932 container remove 0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_shaw, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 22 05:04:54 np0005591762 systemd[1]: libpod-conmon-0e041b95aefd62fca896e0b548cb1a8cde9dd29b3c640d9b2eb129ed3136e734.scope: Deactivated successfully.
Jan 22 05:04:55 np0005591762 nova_compute[225313]: 2026-01-22 10:04:55.409 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:04:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:55.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:04:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:55 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:04:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:55 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:55 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:04:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:04:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:04:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:56 np0005591762 nova_compute[225313]: 2026-01-22 10:04:56.735 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:04:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:04:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:57.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:58 np0005591762 podman[245128]: 2026-01-22 10:04:58.449067123 +0000 UTC m=+0.063855185 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.568076) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076298568131, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 537, "num_deletes": 250, "total_data_size": 961497, "memory_usage": 972088, "flush_reason": "Manual Compaction"}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076298570817, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 616510, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30630, "largest_seqno": 31162, "table_properties": {"data_size": 613565, "index_size": 917, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6079, "raw_average_key_size": 16, "raw_value_size": 607755, "raw_average_value_size": 1683, "num_data_blocks": 38, "num_entries": 361, "num_filter_entries": 361, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769076278, "oldest_key_time": 1769076278, "file_creation_time": 1769076298, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 2754 microseconds, and 1857 cpu microseconds.
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.570844) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 616510 bytes OK
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.570854) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.571295) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.571304) EVENT_LOG_v1 {"time_micros": 1769076298571301, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.571315) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 958340, prev total WAL file size 958340, number of live WAL files 2.
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.571726) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323533' seq:72057594037927935, type:22 .. '6B7600353034' seq:0, type:0; will stop at (end)
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(602KB)], [57(15MB)]
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076298571752, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 17182013, "oldest_snapshot_seqno": -1}
Jan 22 05:04:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:04:58.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 6343 keys, 15941757 bytes, temperature: kUnknown
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076298610545, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 15941757, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15897691, "index_size": 27124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15877, "raw_key_size": 164549, "raw_average_key_size": 25, "raw_value_size": 15781498, "raw_average_value_size": 2488, "num_data_blocks": 1091, "num_entries": 6343, "num_filter_entries": 6343, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769076298, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.610691) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 15941757 bytes
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.611231) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 442.2 rd, 410.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 15.8 +0.0 blob) out(15.2 +0.0 blob), read-write-amplify(53.7) write-amplify(25.9) OK, records in: 6860, records dropped: 517 output_compression: NoCompression
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.611245) EVENT_LOG_v1 {"time_micros": 1769076298611239, "job": 34, "event": "compaction_finished", "compaction_time_micros": 38853, "compaction_time_cpu_micros": 24099, "output_level": 6, "num_output_files": 1, "total_output_size": 15941757, "num_input_records": 6860, "num_output_records": 6343, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076298611543, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076298613494, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.571652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.613578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.613582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.613584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.613585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:58 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:04:58.613586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:04:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:59 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:59 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:04:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:04:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:04:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:04:59.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:04:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:04:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:04:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:04:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:00 np0005591762 nova_compute[225313]: 2026-01-22 10:05:00.411 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:01.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:01 np0005591762 nova_compute[225313]: 2026-01-22 10:05:01.737 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:03.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:05 np0005591762 nova_compute[225313]: 2026-01-22 10:05:05.412 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:05.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:06.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:06 np0005591762 nova_compute[225313]: 2026-01-22 10:05:06.737 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:07.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:08.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:09.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:10 np0005591762 nova_compute[225313]: 2026-01-22 10:05:10.413 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:10.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:11.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:11 np0005591762 nova_compute[225313]: 2026-01-22 10:05:11.739 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:12.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:14.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:15 np0005591762 nova_compute[225313]: 2026-01-22 10:05:15.416 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:15.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:16.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:16 np0005591762 nova_compute[225313]: 2026-01-22 10:05:16.741 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:17.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:18.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.735 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.735 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.735 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.735 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:05:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.750 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.750 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:18 np0005591762 nova_compute[225313]: 2026-01-22 10:05:18.751 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:05:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:19.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:20 np0005591762 nova_compute[225313]: 2026-01-22 10:05:20.417 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:20 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:05:20 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1181781183' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:05:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:20.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:20 np0005591762 nova_compute[225313]: 2026-01-22 10:05:20.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:20 np0005591762 nova_compute[225313]: 2026-01-22 10:05:20.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:21.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:21 np0005591762 nova_compute[225313]: 2026-01-22 10:05:21.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:21 np0005591762 nova_compute[225313]: 2026-01-22 10:05:21.741 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:22.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:22 np0005591762 podman[245203]: 2026-01-22 10:05:22.815981171 +0000 UTC m=+0.039013765 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 05:05:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:23.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:24.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:24 np0005591762 nova_compute[225313]: 2026-01-22 10:05:24.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:25 np0005591762 nova_compute[225313]: 2026-01-22 10:05:25.419 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:05:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:25.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:05:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:26.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:26 np0005591762 nova_compute[225313]: 2026-01-22 10:05:26.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:26 np0005591762 nova_compute[225313]: 2026-01-22 10:05:26.743 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:27.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:28.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:28 np0005591762 nova_compute[225313]: 2026-01-22 10:05:28.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:05:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:28 np0005591762 nova_compute[225313]: 2026-01-22 10:05:28.737 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:05:28 np0005591762 nova_compute[225313]: 2026-01-22 10:05:28.737 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:05:28 np0005591762 nova_compute[225313]: 2026-01-22 10:05:28.737 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:05:28 np0005591762 nova_compute[225313]: 2026-01-22 10:05:28.737 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:05:28 np0005591762 nova_compute[225313]: 2026-01-22 10:05:28.738 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:05:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:28 np0005591762 podman[245227]: 2026-01-22 10:05:28.833232256 +0000 UTC m=+0.056310638 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.080 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.343s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.285 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.286 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4847MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.286 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.286 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.355 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.355 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.368 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:05:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:29.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:05:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3006463647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.718 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.722 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:05:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.733 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.735 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:05:29 np0005591762 nova_compute[225313]: 2026-01-22 10:05:29.735 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:05:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:30 np0005591762 nova_compute[225313]: 2026-01-22 10:05:30.420 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:30.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:05:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:31.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:05:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:31 np0005591762 nova_compute[225313]: 2026-01-22 10:05:31.744 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:32.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:33.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:35 np0005591762 nova_compute[225313]: 2026-01-22 10:05:35.423 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:35.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:36.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:36 np0005591762 nova_compute[225313]: 2026-01-22 10:05:36.747 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:05:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:37.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:05:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:38.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:39.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:40 np0005591762 nova_compute[225313]: 2026-01-22 10:05:40.424 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:05:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:40.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:05:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:41.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:41 np0005591762 nova_compute[225313]: 2026-01-22 10:05:41.748 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:42.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:43.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:44.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:45 np0005591762 nova_compute[225313]: 2026-01-22 10:05:45.424 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:46.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:46 np0005591762 nova_compute[225313]: 2026-01-22 10:05:46.748 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:05:47.208 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:05:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:05:47.210 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:05:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:05:47.210 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:05:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:47.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:48.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:49.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:50 np0005591762 nova_compute[225313]: 2026-01-22 10:05:50.425 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:50.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.801300) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076350801347, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 737, "num_deletes": 251, "total_data_size": 1387802, "memory_usage": 1408096, "flush_reason": "Manual Compaction"}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076350804766, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 912146, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31167, "largest_seqno": 31899, "table_properties": {"data_size": 908665, "index_size": 1325, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7960, "raw_average_key_size": 19, "raw_value_size": 901741, "raw_average_value_size": 2167, "num_data_blocks": 60, "num_entries": 416, "num_filter_entries": 416, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769076299, "oldest_key_time": 1769076299, "file_creation_time": 1769076350, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 3475 microseconds, and 2354 cpu microseconds.
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.804785) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 912146 bytes OK
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.804796) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.805312) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.805322) EVENT_LOG_v1 {"time_micros": 1769076350805319, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.805334) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 1383904, prev total WAL file size 1383904, number of live WAL files 2.
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.805688) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(890KB)], [60(15MB)]
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076350805704, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16853903, "oldest_snapshot_seqno": -1}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6248 keys, 14812469 bytes, temperature: kUnknown
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076350836057, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14812469, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14769873, "index_size": 25846, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 163231, "raw_average_key_size": 26, "raw_value_size": 14656224, "raw_average_value_size": 2345, "num_data_blocks": 1034, "num_entries": 6248, "num_filter_entries": 6248, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769076350, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.836186) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14812469 bytes
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.838700) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 554.7 rd, 487.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 15.2 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(34.7) write-amplify(16.2) OK, records in: 6759, records dropped: 511 output_compression: NoCompression
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.838713) EVENT_LOG_v1 {"time_micros": 1769076350838707, "job": 36, "event": "compaction_finished", "compaction_time_micros": 30386, "compaction_time_cpu_micros": 21747, "output_level": 6, "num_output_files": 1, "total_output_size": 14812469, "num_input_records": 6759, "num_output_records": 6248, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076350838878, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076350840888, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.805639) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.840928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.840931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.840932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.840933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:05:50 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:05:50.840935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:05:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:05:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1824340143' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:05:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:05:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1824340143' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:05:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:51.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:51 np0005591762 nova_compute[225313]: 2026-01-22 10:05:51.749 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:52.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:53.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:53 np0005591762 podman[245343]: 2026-01-22 10:05:53.821056541 +0000 UTC m=+0.041345145 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 05:05:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:54.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:55 np0005591762 nova_compute[225313]: 2026-01-22 10:05:55.426 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:55.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:56.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:56 np0005591762 nova_compute[225313]: 2026-01-22 10:05:56.751 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:05:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:05:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:57.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:05:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:05:58.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:05:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:05:58 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:05:59 np0005591762 podman[245563]: 2026-01-22 10:05:59.450153608 +0000 UTC m=+0.084318623 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 05:05:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:05:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:05:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:05:59.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:05:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:05:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:05:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:05:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:00 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:06:00 np0005591762 nova_compute[225313]: 2026-01-22 10:06:00.427 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:00.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:01.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:01 np0005591762 nova_compute[225313]: 2026-01-22 10:06:01.753 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:02.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:03.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:06:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:04.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:05 np0005591762 nova_compute[225313]: 2026-01-22 10:06:05.428 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:05.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:06.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:06 np0005591762 nova_compute[225313]: 2026-01-22 10:06:06.755 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:07.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:08.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:09.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:10 np0005591762 nova_compute[225313]: 2026-01-22 10:06:10.432 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:10.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:11.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:11 np0005591762 nova_compute[225313]: 2026-01-22 10:06:11.757 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:12.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:13.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:14.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:15 np0005591762 nova_compute[225313]: 2026-01-22 10:06:15.433 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:15.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:16.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:16 np0005591762 nova_compute[225313]: 2026-01-22 10:06:16.759 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:17.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:18.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:18 np0005591762 nova_compute[225313]: 2026-01-22 10:06:18.732 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:18 np0005591762 nova_compute[225313]: 2026-01-22 10:06:18.749 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:18 np0005591762 nova_compute[225313]: 2026-01-22 10:06:18.749 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:06:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:19.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:19 np0005591762 nova_compute[225313]: 2026-01-22 10:06:19.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:19 np0005591762 nova_compute[225313]: 2026-01-22 10:06:19.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:19 np0005591762 nova_compute[225313]: 2026-01-22 10:06:19.724 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:06:19 np0005591762 nova_compute[225313]: 2026-01-22 10:06:19.724 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:06:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:19 np0005591762 nova_compute[225313]: 2026-01-22 10:06:19.746 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:06:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:20 np0005591762 nova_compute[225313]: 2026-01-22 10:06:20.435 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:20.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:20 np0005591762 nova_compute[225313]: 2026-01-22 10:06:20.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:21.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:21 np0005591762 nova_compute[225313]: 2026-01-22 10:06:21.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:21 np0005591762 nova_compute[225313]: 2026-01-22 10:06:21.760 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:22.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:23.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:23 np0005591762 nova_compute[225313]: 2026-01-22 10:06:23.724 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:24.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:24 np0005591762 podman[245704]: 2026-01-22 10:06:24.833069969 +0000 UTC m=+0.049055528 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 05:06:25 np0005591762 nova_compute[225313]: 2026-01-22 10:06:25.439 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:25.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:25 np0005591762 nova_compute[225313]: 2026-01-22 10:06:25.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:26.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:26 np0005591762 nova_compute[225313]: 2026-01-22 10:06:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:26 np0005591762 nova_compute[225313]: 2026-01-22 10:06:26.764 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:27.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:28.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:28 np0005591762 nova_compute[225313]: 2026-01-22 10:06:28.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:06:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:28 np0005591762 nova_compute[225313]: 2026-01-22 10:06:28.749 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:06:28 np0005591762 nova_compute[225313]: 2026-01-22 10:06:28.749 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:06:28 np0005591762 nova_compute[225313]: 2026-01-22 10:06:28.749 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:06:28 np0005591762 nova_compute[225313]: 2026-01-22 10:06:28.749 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:06:28 np0005591762 nova_compute[225313]: 2026-01-22 10:06:28.749 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:06:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:06:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2474770243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.094 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.344s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.323 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.325 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4866MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.326 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.326 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.371 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.371 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.386 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:06:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:29.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:06:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1546957811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:06:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.766 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.380s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.772 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.786 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.787 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:06:29 np0005591762 nova_compute[225313]: 2026-01-22 10:06:29.788 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:06:29 np0005591762 podman[245770]: 2026-01-22 10:06:29.849669253 +0000 UTC m=+0.068786379 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 22 05:06:30 np0005591762 nova_compute[225313]: 2026-01-22 10:06:30.440 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:30.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:31.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:31 np0005591762 nova_compute[225313]: 2026-01-22 10:06:31.766 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:32.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:33.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:34.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:35 np0005591762 nova_compute[225313]: 2026-01-22 10:06:35.442 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:35.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:36.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:36 np0005591762 nova_compute[225313]: 2026-01-22 10:06:36.766 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:37.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:38.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:39.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:40 np0005591762 nova_compute[225313]: 2026-01-22 10:06:40.444 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:40.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:41.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:41 np0005591762 nova_compute[225313]: 2026-01-22 10:06:41.767 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:42.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:43.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:44.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:45 np0005591762 nova_compute[225313]: 2026-01-22 10:06:45.446 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:45.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:46.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:46 np0005591762 nova_compute[225313]: 2026-01-22 10:06:46.769 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:06:47.209 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:06:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:06:47.211 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:06:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:06:47.211 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:06:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:47.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:48.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:49.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:50 np0005591762 nova_compute[225313]: 2026-01-22 10:06:50.447 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:50.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:06:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1764106381' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:06:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:06:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1764106381' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:06:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:51.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:51 np0005591762 nova_compute[225313]: 2026-01-22 10:06:51.772 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:52.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:53.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:54.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:55 np0005591762 nova_compute[225313]: 2026-01-22 10:06:55.448 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:06:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:55.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:06:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:55 np0005591762 podman[245871]: 2026-01-22 10:06:55.826143507 +0000 UTC m=+0.046045671 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 05:06:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:56.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:56 np0005591762 nova_compute[225313]: 2026-01-22 10:06:56.774 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:06:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:06:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:06:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:57.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:06:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:06:58.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:06:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:06:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:06:59.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:06:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:06:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:06:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:06:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:00 np0005591762 nova_compute[225313]: 2026-01-22 10:07:00.449 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:00.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:00 np0005591762 podman[245892]: 2026-01-22 10:07:00.858002278 +0000 UTC m=+0.069089100 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 22 05:07:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:01.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:01 np0005591762 nova_compute[225313]: 2026-01-22 10:07:01.775 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:02.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:03.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:03 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:07:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:07:03 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:07:03 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:07:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:04.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:05 np0005591762 nova_compute[225313]: 2026-01-22 10:07:05.451 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:05.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:07:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:06.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:07:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:06 np0005591762 nova_compute[225313]: 2026-01-22 10:07:06.778 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:07:06 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:07:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:07.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:08.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:07:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:09.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:07:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:10 np0005591762 nova_compute[225313]: 2026-01-22 10:07:10.451 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:10.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:11.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:11 np0005591762 nova_compute[225313]: 2026-01-22 10:07:11.779 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:12.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:13.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:14.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:15 np0005591762 nova_compute[225313]: 2026-01-22 10:07:15.451 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:15.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:16.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:16 np0005591762 nova_compute[225313]: 2026-01-22 10:07:16.782 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:17.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:18.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:19.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.454 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:20.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.788 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.789 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.789 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.789 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.800 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.800 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.800 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:20 np0005591762 nova_compute[225313]: 2026-01-22 10:07:20.800 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:07:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:21.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:21 np0005591762 nova_compute[225313]: 2026-01-22 10:07:21.784 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:22 np0005591762 nova_compute[225313]: 2026-01-22 10:07:22.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:22.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:23.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:07:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:24.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:07:25 np0005591762 nova_compute[225313]: 2026-01-22 10:07:25.454 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:25.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:25 np0005591762 nova_compute[225313]: 2026-01-22 10:07:25.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:07:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:26.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:07:26 np0005591762 nova_compute[225313]: 2026-01-22 10:07:26.784 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:26 np0005591762 podman[246071]: 2026-01-22 10:07:26.823459044 +0000 UTC m=+0.045558261 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 05:07:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:27.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:27 np0005591762 nova_compute[225313]: 2026-01-22 10:07:27.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.741 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.741 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.741 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.741 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:07:28 np0005591762 nova_compute[225313]: 2026-01-22 10:07:28.742 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:07:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:28.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:07:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/428737560' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.095 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.353s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.291 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.292 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4862MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.292 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.292 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.420 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.421 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.434 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:07:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:29.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:29 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:07:29 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3885434280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.771 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.777 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.795 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.797 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:07:29 np0005591762 nova_compute[225313]: 2026-01-22 10:07:29.797 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:07:30 np0005591762 nova_compute[225313]: 2026-01-22 10:07:30.455 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:30.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:31.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:31 np0005591762 nova_compute[225313]: 2026-01-22 10:07:31.786 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:31 np0005591762 podman[246136]: 2026-01-22 10:07:31.846309696 +0000 UTC m=+0.070642069 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 05:07:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:32.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:33.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:34.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:35 np0005591762 nova_compute[225313]: 2026-01-22 10:07:35.457 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:35.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:36.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:36 np0005591762 nova_compute[225313]: 2026-01-22 10:07:36.789 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:37.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:38.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:39.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:40 np0005591762 nova_compute[225313]: 2026-01-22 10:07:40.459 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:40.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:41.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:41 np0005591762 nova_compute[225313]: 2026-01-22 10:07:41.792 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:42.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:43.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:44.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:45 np0005591762 nova_compute[225313]: 2026-01-22 10:07:45.460 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:45.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:46.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:46 np0005591762 nova_compute[225313]: 2026-01-22 10:07:46.793 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:07:47.212 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:07:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:07:47.212 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:07:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:07:47.212 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:07:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:47.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:48.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:49.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:50 np0005591762 nova_compute[225313]: 2026-01-22 10:07:50.462 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:50.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:07:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3787875451' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:07:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:07:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3787875451' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:07:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:51.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:51 np0005591762 nova_compute[225313]: 2026-01-22 10:07:51.796 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:07:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:52.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:07:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:53.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:54.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:55 np0005591762 nova_compute[225313]: 2026-01-22 10:07:55.463 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:55.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:56.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:56 np0005591762 nova_compute[225313]: 2026-01-22 10:07:56.799 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:07:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:07:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:07:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:57.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:07:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:57 np0005591762 podman[246235]: 2026-01-22 10:07:57.843079596 +0000 UTC m=+0.064906142 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 05:07:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:07:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:07:58.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:07:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:07:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:07:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:07:59.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:07:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:07:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:07:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:07:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:00 np0005591762 nova_compute[225313]: 2026-01-22 10:08:00.467 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:00.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:08:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:01.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:08:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:01 np0005591762 nova_compute[225313]: 2026-01-22 10:08:01.805 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:02.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:02 np0005591762 podman[246256]: 2026-01-22 10:08:02.8600513 +0000 UTC m=+0.078835401 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 05:08:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:08:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:03.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:08:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:04.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:05 np0005591762 nova_compute[225313]: 2026-01-22 10:08:05.466 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:05.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:06 np0005591762 nova_compute[225313]: 2026-01-22 10:08:06.810 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:07.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:08:07 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:08:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:08.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:09.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:10 np0005591762 nova_compute[225313]: 2026-01-22 10:08:10.468 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:08:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:10.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:08:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:08:10 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:08:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:11.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:11 np0005591762 nova_compute[225313]: 2026-01-22 10:08:11.811 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:12.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:13.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:14.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:15 np0005591762 nova_compute[225313]: 2026-01-22 10:08:15.469 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:15.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:16.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:16 np0005591762 nova_compute[225313]: 2026-01-22 10:08:16.812 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:17.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:18.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:19.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:20 np0005591762 nova_compute[225313]: 2026-01-22 10:08:20.472 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:08:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:20.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:08:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:21.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.794 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.794 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.795 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.795 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.812 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.882 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.882 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:21 np0005591762 nova_compute[225313]: 2026-01-22 10:08:21.882 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:08:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:22 np0005591762 nova_compute[225313]: 2026-01-22 10:08:22.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:22 np0005591762 nova_compute[225313]: 2026-01-22 10:08:22.735 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:22 np0005591762 nova_compute[225313]: 2026-01-22 10:08:22.735 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:22.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:23.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:24 np0005591762 nova_compute[225313]: 2026-01-22 10:08:24.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:24 np0005591762 nova_compute[225313]: 2026-01-22 10:08:24.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 05:08:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:24.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:25 np0005591762 nova_compute[225313]: 2026-01-22 10:08:25.474 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:25.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:26 np0005591762 nova_compute[225313]: 2026-01-22 10:08:26.750 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:26 np0005591762 nova_compute[225313]: 2026-01-22 10:08:26.751 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 05:08:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:26 np0005591762 nova_compute[225313]: 2026-01-22 10:08:26.764 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 05:08:26 np0005591762 nova_compute[225313]: 2026-01-22 10:08:26.813 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:26.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:27.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:27 np0005591762 nova_compute[225313]: 2026-01-22 10:08:27.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:27 np0005591762 nova_compute[225313]: 2026-01-22 10:08:27.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:28 np0005591762 nova_compute[225313]: 2026-01-22 10:08:28.736 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:28 np0005591762 nova_compute[225313]: 2026-01-22 10:08:28.736 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:28.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:28 np0005591762 podman[246434]: 2026-01-22 10:08:28.819892186 +0000 UTC m=+0.043653680 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 22 05:08:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:29.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.476 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.744 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.744 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.744 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.744 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:08:30 np0005591762 nova_compute[225313]: 2026-01-22 10:08:30.744 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:08:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:30.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:08:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2840877305' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.095 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.351s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.279 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.280 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4859MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.280 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.281 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.352 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.353 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.391 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing inventories for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.457 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating ProviderTree inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.457 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.469 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing aggregate associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.487 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing trait associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, traits: HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX512VAES,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AESNI,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.499 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:08:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:31.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.814 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.875 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.377s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.879 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.891 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.892 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:08:31 np0005591762 nova_compute[225313]: 2026-01-22 10:08:31.892 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:08:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:32.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:33.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:33 np0005591762 podman[246499]: 2026-01-22 10:08:33.85502339 +0000 UTC m=+0.071419716 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller)
Jan 22 05:08:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:34.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:35 np0005591762 nova_compute[225313]: 2026-01-22 10:08:35.478 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:35.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:36 np0005591762 nova_compute[225313]: 2026-01-22 10:08:36.816 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:36.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:37.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:38.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:39.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:40 np0005591762 nova_compute[225313]: 2026-01-22 10:08:40.480 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:40.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:41.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:41 np0005591762 nova_compute[225313]: 2026-01-22 10:08:41.817 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:41 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:42.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:43.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:44.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:45 np0005591762 nova_compute[225313]: 2026-01-22 10:08:45.481 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:45.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:46 np0005591762 nova_compute[225313]: 2026-01-22 10:08:46.818 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:46.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:46 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:08:47.213 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:08:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:08:47.213 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:08:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:08:47.213 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:08:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:47.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:48.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:08:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:49.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:08:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:50 np0005591762 nova_compute[225313]: 2026-01-22 10:08:50.483 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:50.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:08:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:51.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:08:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:51 np0005591762 nova_compute[225313]: 2026-01-22 10:08:51.821 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:52.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:53.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:54.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:55 np0005591762 nova_compute[225313]: 2026-01-22 10:08:55.484 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:55.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:56 np0005591762 nova_compute[225313]: 2026-01-22 10:08:56.824 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:08:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:56.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:56 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:08:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:57.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:08:58.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:08:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:08:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:08:59.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:08:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:08:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:08:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:08:59 np0005591762 podman[246599]: 2026-01-22 10:08:59.813188626 +0000 UTC m=+0.037331176 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 05:09:00 np0005591762 nova_compute[225313]: 2026-01-22 10:09:00.486 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:00.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:01.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:01 np0005591762 nova_compute[225313]: 2026-01-22 10:09:01.826 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:02.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:03.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:04 np0005591762 podman[246619]: 2026-01-22 10:09:04.845167513 +0000 UTC m=+0.063248565 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 05:09:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:09:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:04.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:09:05 np0005591762 nova_compute[225313]: 2026-01-22 10:09:05.488 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:05.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:06 np0005591762 nova_compute[225313]: 2026-01-22 10:09:06.829 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:06.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:07.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:08.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:09.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:10 np0005591762 nova_compute[225313]: 2026-01-22 10:09:10.490 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:10.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:11 np0005591762 podman[246755]: 2026-01-22 10:09:11.49026431 +0000 UTC m=+0.052586938 container exec f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 05:09:11 np0005591762 podman[246755]: 2026-01-22 10:09:11.596647925 +0000 UTC m=+0.158970562 container exec_died f08ed0453a484a85ad229a2e188b55169a8c2545992181be6864ff55762bbab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 22 05:09:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:11.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:11 np0005591762 nova_compute[225313]: 2026-01-22 10:09:11.831 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:11 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:12 np0005591762 podman[246864]: 2026-01-22 10:09:12.085450195 +0000 UTC m=+0.050273704 container exec 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 05:09:12 np0005591762 podman[246864]: 2026-01-22 10:09:12.09667843 +0000 UTC m=+0.061501929 container exec_died 30cd3f77ecd170550c59460efc32670be78bd0b27f194ac797ffbd185f62d000 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 05:09:12 np0005591762 podman[246935]: 2026-01-22 10:09:12.361855783 +0000 UTC m=+0.048216384 container exec e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 05:09:12 np0005591762 podman[246935]: 2026-01-22 10:09:12.373680705 +0000 UTC m=+0.060041306 container exec_died e696e5c55c2acb57b82a8ead479495469741ca45accd76ed01a65e605a74644e (image=quay.io/ceph/haproxy:2.3, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-haproxy-rgw-default-compute-2-czpvbf)
Jan 22 05:09:12 np0005591762 podman[246985]: 2026-01-22 10:09:12.556618106 +0000 UTC m=+0.041512501 container exec 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, name=keepalived, release=1793, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, distribution-scope=public, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, architecture=x86_64, io.buildah.version=1.28.2, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 22 05:09:12 np0005591762 podman[246985]: 2026-01-22 10:09:12.582598614 +0000 UTC m=+0.067493009 container exec_died 88cbed3c53cfeca8f2b144595f8887ef027b68eabf8d5e2e908197c993c91fe0 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, name=keepalived, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, vcs-type=git, version=2.2.4, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9)
Jan 22 05:09:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:12 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:12.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.444962) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076553445040, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2603, "num_deletes": 503, "total_data_size": 6298723, "memory_usage": 6406984, "flush_reason": "Manual Compaction"}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076553453571, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 3670111, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31904, "largest_seqno": 34502, "table_properties": {"data_size": 3660683, "index_size": 5346, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 23766, "raw_average_key_size": 20, "raw_value_size": 3639361, "raw_average_value_size": 3071, "num_data_blocks": 231, "num_entries": 1185, "num_filter_entries": 1185, "num_deletions": 503, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769076351, "oldest_key_time": 1769076351, "file_creation_time": 1769076553, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 8646 microseconds, and 6586 cpu microseconds.
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.453621) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 3670111 bytes OK
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.453635) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 22 05:09:13 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.453992) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 22 05:09:13 np0005591762 rsyslogd[963]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.454003) EVENT_LOG_v1 {"time_micros": 1769076553453999, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.454017) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6286383, prev total WAL file size 6286383, number of live WAL files 2.
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.455700) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(3584KB)], [63(14MB)]
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076553455740, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18482580, "oldest_snapshot_seqno": -1}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6431 keys, 12562125 bytes, temperature: kUnknown
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076553485197, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 12562125, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12522102, "index_size": 22825, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16133, "raw_key_size": 168094, "raw_average_key_size": 26, "raw_value_size": 12408811, "raw_average_value_size": 1929, "num_data_blocks": 897, "num_entries": 6431, "num_filter_entries": 6431, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769076553, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.485485) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 12562125 bytes
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.485960) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 625.4 rd, 425.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 14.1 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(8.5) write-amplify(3.4) OK, records in: 7433, records dropped: 1002 output_compression: NoCompression
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.485979) EVENT_LOG_v1 {"time_micros": 1769076553485969, "job": 38, "event": "compaction_finished", "compaction_time_micros": 29552, "compaction_time_cpu_micros": 24619, "output_level": 6, "num_output_files": 1, "total_output_size": 12562125, "num_input_records": 7433, "num_output_records": 6431, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076553486615, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076553488851, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.455611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.488899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.488901) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.488902) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.488903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:09:13.488905) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:09:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000021s ======
Jan 22 05:09:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:13.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000021s
Jan 22 05:09:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:13 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:09:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:14.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:15 np0005591762 nova_compute[225313]: 2026-01-22 10:09:15.492 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:15.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:16 np0005591762 nova_compute[225313]: 2026-01-22 10:09:16.834 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:16.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:16 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:09:16 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:17.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:18.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:19.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:20 np0005591762 nova_compute[225313]: 2026-01-22 10:09:20.495 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:20.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:21 np0005591762 nova_compute[225313]: 2026-01-22 10:09:21.464 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:21 np0005591762 nova_compute[225313]: 2026-01-22 10:09:21.737 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:21 np0005591762 nova_compute[225313]: 2026-01-22 10:09:21.737 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:09:21 np0005591762 nova_compute[225313]: 2026-01-22 10:09:21.738 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:09:21 np0005591762 nova_compute[225313]: 2026-01-22 10:09:21.749 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:09:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:21.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:21 np0005591762 nova_compute[225313]: 2026-01-22 10:09:21.837 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:21 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:22 np0005591762 nova_compute[225313]: 2026-01-22 10:09:22.721 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:22 np0005591762 nova_compute[225313]: 2026-01-22 10:09:22.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:22.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:23 np0005591762 nova_compute[225313]: 2026-01-22 10:09:23.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:23 np0005591762 nova_compute[225313]: 2026-01-22 10:09:23.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:09:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:23.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:24 np0005591762 nova_compute[225313]: 2026-01-22 10:09:24.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:24.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:25 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:09:25 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4021274664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:09:25 np0005591762 nova_compute[225313]: 2026-01-22 10:09:25.497 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:25.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:26 np0005591762 nova_compute[225313]: 2026-01-22 10:09:26.839 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:26.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:26 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:27 np0005591762 nova_compute[225313]: 2026-01-22 10:09:27.724 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:27.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:28.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:29 np0005591762 nova_compute[225313]: 2026-01-22 10:09:29.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:29.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.497 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.739 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.739 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:09:30 np0005591762 nova_compute[225313]: 2026-01-22 10:09:30.739 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:09:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:30 np0005591762 podman[247215]: 2026-01-22 10:09:30.823994624 +0000 UTC m=+0.041287687 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 05:09:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:30.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:09:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2214780208' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.147 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.377 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.378 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4843MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.379 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.379 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.422 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.422 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.433 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:09:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:31.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:31 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:09:31 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2058536884' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.793 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.360s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.797 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.809 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.810 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.810 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:09:31 np0005591762 nova_compute[225313]: 2026-01-22 10:09:31.846 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:32.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:33.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:34.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:35 np0005591762 podman[247301]: 2026-01-22 10:09:35.058646429 +0000 UTC m=+0.066006066 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 05:09:35 np0005591762 nova_compute[225313]: 2026-01-22 10:09:35.499 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:35.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:36 np0005591762 nova_compute[225313]: 2026-01-22 10:09:36.848 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:36.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:37.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:38.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:39.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:40 np0005591762 nova_compute[225313]: 2026-01-22 10:09:40.501 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:09:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:40.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:09:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:41.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:41 np0005591762 nova_compute[225313]: 2026-01-22 10:09:41.849 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:42.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:43.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:44.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:45 np0005591762 nova_compute[225313]: 2026-01-22 10:09:45.505 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:45.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:46 np0005591762 nova_compute[225313]: 2026-01-22 10:09:46.850 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:46.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:09:47.213 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:09:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:09:47.214 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:09:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:09:47.214 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:09:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:47.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:49.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:50 np0005591762 nova_compute[225313]: 2026-01-22 10:09:50.507 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:51.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:51 np0005591762 nova_compute[225313]: 2026-01-22 10:09:51.851 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:53.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:54.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:55 np0005591762 nova_compute[225313]: 2026-01-22 10:09:55.509 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:55.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:56 np0005591762 nova_compute[225313]: 2026-01-22 10:09:56.852 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:09:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:09:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:56.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:09:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:09:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:09:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:57.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:09:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:09:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:09:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:09:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:09:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:09:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:09:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:09:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:09:59.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:00 np0005591762 nova_compute[225313]: 2026-01-22 10:10:00.512 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:00 np0005591762 ceph-mon[75519]: overall HEALTH_WARN 1 failed cephadm daemon(s)
Jan 22 05:10:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:00.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:01.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:01 np0005591762 podman[247377]: 2026-01-22 10:10:01.825400265 +0000 UTC m=+0.048060323 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 05:10:01 np0005591762 nova_compute[225313]: 2026-01-22 10:10:01.853 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:02.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:03.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:04.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:05 np0005591762 nova_compute[225313]: 2026-01-22 10:10:05.513 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:05.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:05 np0005591762 podman[247397]: 2026-01-22 10:10:05.857335877 +0000 UTC m=+0.074817560 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 05:10:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:06 np0005591762 nova_compute[225313]: 2026-01-22 10:10:06.855 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:06.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:07.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:08.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:10:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:09.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:10:10 np0005591762 nova_compute[225313]: 2026-01-22 10:10:10.514 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:10.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:10:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:11.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:10:11 np0005591762 nova_compute[225313]: 2026-01-22 10:10:11.857 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:10:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:12.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:10:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:13.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:10:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:14.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:10:15 np0005591762 nova_compute[225313]: 2026-01-22 10:10:15.517 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:15.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:16 np0005591762 nova_compute[225313]: 2026-01-22 10:10:16.858 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:16.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:17 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:10:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:10:17 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:10:17 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:10:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:17.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:18.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:19.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:20 np0005591762 nova_compute[225313]: 2026-01-22 10:10:20.521 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:10:20 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:10:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:10:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:20.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:10:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:21.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:21 np0005591762 nova_compute[225313]: 2026-01-22 10:10:21.861 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:22.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:23 np0005591762 nova_compute[225313]: 2026-01-22 10:10:23.812 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:23 np0005591762 nova_compute[225313]: 2026-01-22 10:10:23.813 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:23.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:23 np0005591762 nova_compute[225313]: 2026-01-22 10:10:23.970 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:23 np0005591762 nova_compute[225313]: 2026-01-22 10:10:23.971 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:10:23 np0005591762 nova_compute[225313]: 2026-01-22 10:10:23.971 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:10:23 np0005591762 nova_compute[225313]: 2026-01-22 10:10:23.982 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:10:24 np0005591762 nova_compute[225313]: 2026-01-22 10:10:24.721 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:24 np0005591762 nova_compute[225313]: 2026-01-22 10:10:24.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:24 np0005591762 nova_compute[225313]: 2026-01-22 10:10:24.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:24 np0005591762 nova_compute[225313]: 2026-01-22 10:10:24.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:10:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:24.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:25 np0005591762 nova_compute[225313]: 2026-01-22 10:10:25.523 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:25.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:26 np0005591762 nova_compute[225313]: 2026-01-22 10:10:26.863 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:26.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:27.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:10:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:28.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:10:29 np0005591762 nova_compute[225313]: 2026-01-22 10:10:29.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:29.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:30 np0005591762 nova_compute[225313]: 2026-01-22 10:10:30.525 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:30 np0005591762 nova_compute[225313]: 2026-01-22 10:10:30.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:30.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:31.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:31 np0005591762 nova_compute[225313]: 2026-01-22 10:10:31.866 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.746 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:10:32 np0005591762 nova_compute[225313]: 2026-01-22 10:10:32.746 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:10:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:32 np0005591762 podman[247579]: 2026-01-22 10:10:32.848101216 +0000 UTC m=+0.068724368 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 05:10:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:32.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:10:33 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/954465639' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.091 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.345s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.293 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.295 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4841MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.295 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.295 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.350 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.351 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.365 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:10:33 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:10:33 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2476559011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.708 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.343s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.712 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.726 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.728 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:10:33 np0005591762 nova_compute[225313]: 2026-01-22 10:10:33.728 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:10:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:33.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:34.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:35 np0005591762 nova_compute[225313]: 2026-01-22 10:10:35.526 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:35.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:36 np0005591762 podman[247667]: 2026-01-22 10:10:36.857712592 +0000 UTC m=+0.083012774 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 05:10:36 np0005591762 nova_compute[225313]: 2026-01-22 10:10:36.868 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:36.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:37.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:38.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:39.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:40 np0005591762 nova_compute[225313]: 2026-01-22 10:10:40.528 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:40.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:41.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:41 np0005591762 nova_compute[225313]: 2026-01-22 10:10:41.868 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:42.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:43.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:44.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:45 np0005591762 nova_compute[225313]: 2026-01-22 10:10:45.530 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:45.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:46 np0005591762 nova_compute[225313]: 2026-01-22 10:10:46.871 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:46.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:10:47.216 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:10:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:10:47.216 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:10:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:10:47.216 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:10:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:47.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:48.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:49.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:50 np0005591762 nova_compute[225313]: 2026-01-22 10:10:50.531 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:50.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 22 05:10:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938882043' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 22 05:10:51 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 22 05:10:51 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938882043' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 22 05:10:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:51.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:51 np0005591762 nova_compute[225313]: 2026-01-22 10:10:51.873 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:52.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:53.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:54.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:55 np0005591762 nova_compute[225313]: 2026-01-22 10:10:55.532 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:10:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:55.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:10:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:56 np0005591762 nova_compute[225313]: 2026-01-22 10:10:56.876 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:10:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:56.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:10:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:57.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:10:58.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:10:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:10:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:10:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:10:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:10:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:10:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:10:59.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:00 np0005591762 nova_compute[225313]: 2026-01-22 10:11:00.533 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:00.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:01 np0005591762 nova_compute[225313]: 2026-01-22 10:11:01.879 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:01.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:02.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:03 np0005591762 podman[247742]: 2026-01-22 10:11:03.857635393 +0000 UTC m=+0.079348935 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 05:11:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:03.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:04.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:05 np0005591762 nova_compute[225313]: 2026-01-22 10:11:05.534 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:05.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:06 np0005591762 nova_compute[225313]: 2026-01-22 10:11:06.882 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:06.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:07 np0005591762 podman[247762]: 2026-01-22 10:11:07.853594834 +0000 UTC m=+0.072458571 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 05:11:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:07.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:08 np0005591762 nova_compute[225313]: 2026-01-22 10:11:08.227 225317 DEBUG oslo_concurrency.processutils [None req-effb9288-63a0-473f-963a-bb1b4e8bdf74 12c3378977944a34b6df27af0c168a73 a894ac5b4f744f208fa506d5e8f67970 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:11:08 np0005591762 nova_compute[225313]: 2026-01-22 10:11:08.265 225317 DEBUG oslo_concurrency.processutils [None req-effb9288-63a0-473f-963a-bb1b4e8bdf74 12c3378977944a34b6df27af0c168a73 a894ac5b4f744f208fa506d5e8f67970 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:11:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:08.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:09.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:10 np0005591762 nova_compute[225313]: 2026-01-22 10:11:10.536 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:11 np0005591762 nova_compute[225313]: 2026-01-22 10:11:11.884 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:11.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:12 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:11:12.656 143150 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:52:1d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ec:a7:e9:bb:bd'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 05:11:12 np0005591762 nova_compute[225313]: 2026-01-22 10:11:12.656 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:12 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:11:12.657 143150 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 05:11:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:13.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:13.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:15.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:15 np0005591762 nova_compute[225313]: 2026-01-22 10:11:15.537 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:15.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:16 np0005591762 nova_compute[225313]: 2026-01-22 10:11:16.886 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:17.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:17.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:19.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:19.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:20 np0005591762 nova_compute[225313]: 2026-01-22 10:11:20.540 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:21.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:21 np0005591762 nova_compute[225313]: 2026-01-22 10:11:21.890 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:21.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:21 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:11:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:11:21 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:11:21 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:11:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:22 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:11:22.658 143150 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61e0485d-79f8-4954-8f50-00743b2f8934, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 05:11:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:23.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:23.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:24 np0005591762 nova_compute[225313]: 2026-01-22 10:11:24.730 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:24 np0005591762 nova_compute[225313]: 2026-01-22 10:11:24.730 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:24 np0005591762 nova_compute[225313]: 2026-01-22 10:11:24.731 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:11:24 np0005591762 nova_compute[225313]: 2026-01-22 10:11:24.731 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:11:24 np0005591762 nova_compute[225313]: 2026-01-22 10:11:24.744 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:11:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:11:24 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:11:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:25.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:25 np0005591762 nova_compute[225313]: 2026-01-22 10:11:25.541 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:25 np0005591762 nova_compute[225313]: 2026-01-22 10:11:25.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:25 np0005591762 nova_compute[225313]: 2026-01-22 10:11:25.722 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:11:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:25.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:26 np0005591762 nova_compute[225313]: 2026-01-22 10:11:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:26 np0005591762 nova_compute[225313]: 2026-01-22 10:11:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:26 np0005591762 nova_compute[225313]: 2026-01-22 10:11:26.891 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:27.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:27.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:29.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:29 np0005591762 nova_compute[225313]: 2026-01-22 10:11:29.725 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:29.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:30 np0005591762 nova_compute[225313]: 2026-01-22 10:11:30.542 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:30 np0005591762 nova_compute[225313]: 2026-01-22 10:11:30.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:31.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:31 np0005591762 nova_compute[225313]: 2026-01-22 10:11:31.893 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:31.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:33.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.746 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.746 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.747 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.747 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:11:33 np0005591762 nova_compute[225313]: 2026-01-22 10:11:33.747 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:11:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:33.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:11:34 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1624399559' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.136 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.387 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.388 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4883MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.390 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.390 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.461 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.461 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.475 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:11:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:34 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:11:34 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4289303631' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:11:34 np0005591762 podman[247984]: 2026-01-22 10:11:34.833278692 +0000 UTC m=+0.048486347 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.844 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.850 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.863 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.864 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:11:34 np0005591762 nova_compute[225313]: 2026-01-22 10:11:34.865 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:11:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:35.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:35 np0005591762 nova_compute[225313]: 2026-01-22 10:11:35.544 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:35.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:36 np0005591762 nova_compute[225313]: 2026-01-22 10:11:36.897 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:37.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:37.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:38 np0005591762 podman[248031]: 2026-01-22 10:11:38.888509328 +0000 UTC m=+0.098879189 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 05:11:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:39.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:39.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:40 np0005591762 nova_compute[225313]: 2026-01-22 10:11:40.545 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:41.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:41 np0005591762 nova_compute[225313]: 2026-01-22 10:11:41.898 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:41.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:43.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:43.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:45.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:45 np0005591762 nova_compute[225313]: 2026-01-22 10:11:45.547 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:45.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:46 np0005591762 nova_compute[225313]: 2026-01-22 10:11:46.900 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:47.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:11:47.215 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:11:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:11:47.216 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:11:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:11:47.216 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:11:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.004000041s ======
Jan 22 05:11:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:47.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000041s
Jan 22 05:11:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:49.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:49.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:50 np0005591762 nova_compute[225313]: 2026-01-22 10:11:50.550 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:51.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:51 np0005591762 nova_compute[225313]: 2026-01-22 10:11:51.904 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:51.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:53.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:53.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:55.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:55 np0005591762 nova_compute[225313]: 2026-01-22 10:11:55.549 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:55.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:56 np0005591762 nova_compute[225313]: 2026-01-22 10:11:56.906 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:11:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:11:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:11:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:57.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:11:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:11:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:57.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:11:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:11:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:11:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:11:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:11:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:11:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:11:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:11:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:11:59.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:00 np0005591762 nova_compute[225313]: 2026-01-22 10:12:00.552 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.002000021s ======
Jan 22 05:12:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:01.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000021s
Jan 22 05:12:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:01 np0005591762 nova_compute[225313]: 2026-01-22 10:12:01.908 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:01.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:03.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:03.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:05.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:05 np0005591762 nova_compute[225313]: 2026-01-22 10:12:05.553 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:05 np0005591762 podman[248107]: 2026-01-22 10:12:05.824042833 +0000 UTC m=+0.044121154 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 05:12:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:05.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:06 np0005591762 nova_compute[225313]: 2026-01-22 10:12:06.910 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:07.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:07.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:09.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:09 np0005591762 podman[248127]: 2026-01-22 10:12:09.847102274 +0000 UTC m=+0.067472947 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 05:12:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:09.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:10 np0005591762 nova_compute[225313]: 2026-01-22 10:12:10.554 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:11.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:11 np0005591762 nova_compute[225313]: 2026-01-22 10:12:11.913 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:11.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:13.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:13.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:15.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:15 np0005591762 nova_compute[225313]: 2026-01-22 10:12:15.555 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:15.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:16 np0005591762 nova_compute[225313]: 2026-01-22 10:12:16.915 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:17.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:17.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:19.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:19.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:20 np0005591762 nova_compute[225313]: 2026-01-22 10:12:20.557 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:21.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:21 np0005591762 nova_compute[225313]: 2026-01-22 10:12:21.917 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:21.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:23.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:23.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:24 np0005591762 nova_compute[225313]: 2026-01-22 10:12:24.860 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:25.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:25 np0005591762 nova_compute[225313]: 2026-01-22 10:12:25.560 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:25 np0005591762 nova_compute[225313]: 2026-01-22 10:12:25.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:25 np0005591762 nova_compute[225313]: 2026-01-22 10:12:25.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:12:25 np0005591762 nova_compute[225313]: 2026-01-22 10:12:25.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:12:25 np0005591762 nova_compute[225313]: 2026-01-22 10:12:25.734 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:12:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:25.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:26 np0005591762 nova_compute[225313]: 2026-01-22 10:12:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:26 np0005591762 nova_compute[225313]: 2026-01-22 10:12:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:26 np0005591762 nova_compute[225313]: 2026-01-22 10:12:26.724 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:12:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:26 np0005591762 nova_compute[225313]: 2026-01-22 10:12:26.920 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:27.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:27 np0005591762 nova_compute[225313]: 2026-01-22 10:12:27.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:28.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:28 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:12:28 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:12:28 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:12:28 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:12:28 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:12:28 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:12:28 np0005591762 nova_compute[225313]: 2026-01-22 10:12:28.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:29.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:30.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:30 np0005591762 nova_compute[225313]: 2026-01-22 10:12:30.562 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:12:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:12:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:31.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:31 np0005591762 nova_compute[225313]: 2026-01-22 10:12:31.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:31 np0005591762 nova_compute[225313]: 2026-01-22 10:12:31.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:31 np0005591762 nova_compute[225313]: 2026-01-22 10:12:31.923 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:32.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:33.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:34.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.741 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.742 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.742 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.742 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:12:34 np0005591762 nova_compute[225313]: 2026-01-22 10:12:34.742 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:12:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:35.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:12:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3965042102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.137 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.385 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.387 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4877MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.387 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.387 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.442 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.442 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.454 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.563 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:35 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:12:35 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1437416949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.837 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.383s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.842 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.857 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.858 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:12:35 np0005591762 nova_compute[225313]: 2026-01-22 10:12:35.858 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:12:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:36.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:36 np0005591762 podman[248375]: 2026-01-22 10:12:36.853152799 +0000 UTC m=+0.067228695 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 05:12:36 np0005591762 nova_compute[225313]: 2026-01-22 10:12:36.926 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:37.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:38.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:39.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:40.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:40 np0005591762 nova_compute[225313]: 2026-01-22 10:12:40.564 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:40 np0005591762 podman[248395]: 2026-01-22 10:12:40.851886131 +0000 UTC m=+0.072347902 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 05:12:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:41.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:41 np0005591762 nova_compute[225313]: 2026-01-22 10:12:41.927 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:42.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:43.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.624299) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076763624362, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2616, "num_deletes": 508, "total_data_size": 6253289, "memory_usage": 6368960, "flush_reason": "Manual Compaction"}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076763634356, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 4036429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34507, "largest_seqno": 37118, "table_properties": {"data_size": 4026351, "index_size": 5932, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 23679, "raw_average_key_size": 19, "raw_value_size": 4004064, "raw_average_value_size": 3284, "num_data_blocks": 257, "num_entries": 1219, "num_filter_entries": 1219, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769076553, "oldest_key_time": 1769076553, "file_creation_time": 1769076763, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 10082 microseconds, and 8324 cpu microseconds.
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.634390) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 4036429 bytes OK
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.634431) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.634775) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.634787) EVENT_LOG_v1 {"time_micros": 1769076763634784, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.634806) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 6240877, prev total WAL file size 6240877, number of live WAL files 2.
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.635700) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(3941KB)], [66(11MB)]
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076763635729, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 16598554, "oldest_snapshot_seqno": -1}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6619 keys, 14343155 bytes, temperature: kUnknown
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076763668834, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 14343155, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14299988, "index_size": 25519, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 173951, "raw_average_key_size": 26, "raw_value_size": 14181435, "raw_average_value_size": 2142, "num_data_blocks": 1005, "num_entries": 6619, "num_filter_entries": 6619, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769074431, "oldest_key_time": 0, "file_creation_time": 1769076763, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06bb68f0-fa7d-4244-8ddc-ccad0aff042d", "db_session_id": "N2XKLI89NZ0D85XH10TG", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.669132) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 14343155 bytes
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.669634) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 498.8 rd, 431.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 12.0 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 7650, records dropped: 1031 output_compression: NoCompression
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.669649) EVENT_LOG_v1 {"time_micros": 1769076763669642, "job": 40, "event": "compaction_finished", "compaction_time_micros": 33279, "compaction_time_cpu_micros": 23222, "output_level": 6, "num_output_files": 1, "total_output_size": 14343155, "num_input_records": 7650, "num_output_records": 6619, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076763670531, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769076763672468, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.635655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.672638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.672646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.672647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.672649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:12:43 np0005591762 ceph-mon[75519]: rocksdb: (Original Log Time 2026/01/22-10:12:43.672651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 22 05:12:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:44.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:45.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:45 np0005591762 nova_compute[225313]: 2026-01-22 10:12:45.568 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:46.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:46 np0005591762 nova_compute[225313]: 2026-01-22 10:12:46.930 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:47.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:12:47.217 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:12:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:12:47.217 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:12:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:12:47.218 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:12:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:48.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:49.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:50.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:50 np0005591762 nova_compute[225313]: 2026-01-22 10:12:50.571 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:51.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:51 np0005591762 nova_compute[225313]: 2026-01-22 10:12:51.933 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:52.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:53.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:54.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:55.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:55 np0005591762 nova_compute[225313]: 2026-01-22 10:12:55.574 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:12:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:56.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:12:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:56 np0005591762 nova_compute[225313]: 2026-01-22 10:12:56.935 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:12:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:12:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:57.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:12:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:12:58.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:12:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:12:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:12:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:12:59.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:12:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:12:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:12:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:12:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:00.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:00 np0005591762 nova_compute[225313]: 2026-01-22 10:13:00.575 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:01 np0005591762 nova_compute[225313]: 2026-01-22 10:13:01.937 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:02.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:03.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:04.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:05.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:05 np0005591762 nova_compute[225313]: 2026-01-22 10:13:05.577 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:06.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:06 np0005591762 nova_compute[225313]: 2026-01-22 10:13:06.940 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:07.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:07 np0005591762 podman[248470]: 2026-01-22 10:13:07.822989305 +0000 UTC m=+0.044927905 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 05:13:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:08.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:09.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:10.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:10 np0005591762 nova_compute[225313]: 2026-01-22 10:13:10.579 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:11.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:11 np0005591762 podman[248490]: 2026-01-22 10:13:11.882130789 +0000 UTC m=+0.099804987 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 05:13:11 np0005591762 nova_compute[225313]: 2026-01-22 10:13:11.941 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:12.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:13.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:14.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:15.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:15 np0005591762 nova_compute[225313]: 2026-01-22 10:13:15.582 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:16 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:16 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:16 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:16.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:16 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:16 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:16 np0005591762 nova_compute[225313]: 2026-01-22 10:13:16.943 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:17 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:17 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:17 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:17 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:17.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:17 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:17 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:18 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:18 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:18 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:18.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:18 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:18 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:19 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:19 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:19 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:19.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:19 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:19 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:20 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:20 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:20 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:20.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:20 np0005591762 nova_compute[225313]: 2026-01-22 10:13:20.585 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:20 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:20 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:21 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:21 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:21 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:21.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:21 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:21 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:21 np0005591762 nova_compute[225313]: 2026-01-22 10:13:21.947 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:22 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:22 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:22 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:22 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:22 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:22 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:23 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:23 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:23 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:23.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:23 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:23 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:24 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:24 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:24 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:24.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:24 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:24 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:25 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:25 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:25 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:25.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:25 np0005591762 nova_compute[225313]: 2026-01-22 10:13:25.586 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:25 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:25 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:26 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:26 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:26 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:26.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.735 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.735 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.735 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.744 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 05:13:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:26 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:26 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:26 np0005591762 nova_compute[225313]: 2026-01-22 10:13:26.950 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:27 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:27 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:27 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:27 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:27.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:27 np0005591762 nova_compute[225313]: 2026-01-22 10:13:27.731 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:27 np0005591762 nova_compute[225313]: 2026-01-22 10:13:27.731 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 05:13:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:27 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:27 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:28 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:28 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:28 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:28.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:28 np0005591762 nova_compute[225313]: 2026-01-22 10:13:28.719 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:28 np0005591762 nova_compute[225313]: 2026-01-22 10:13:28.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:28 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:28 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:29 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:29 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:29 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:29.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:29 np0005591762 nova_compute[225313]: 2026-01-22 10:13:29.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:29 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:29 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:30 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:30 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:30 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:30.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:30 np0005591762 nova_compute[225313]: 2026-01-22 10:13:30.587 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:30 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:30 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:31 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:31 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:31 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:31.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:31 np0005591762 nova_compute[225313]: 2026-01-22 10:13:31.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:31 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:31 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:31 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 22 05:13:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:13:31 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:13:31 np0005591762 ceph-mon[75519]: from='mgr.14664 192.168.122.100:0/1879606959' entity='mgr.compute-0.rfmoog' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 22 05:13:31 np0005591762 nova_compute[225313]: 2026-01-22 10:13:31.950 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:32 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:32 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:32 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:32 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:32.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:32 np0005591762 nova_compute[225313]: 2026-01-22 10:13:32.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:32 np0005591762 nova_compute[225313]: 2026-01-22 10:13:32.723 225317 DEBUG nova.compute.manager [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 05:13:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:32 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:32 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:33 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:33 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:33 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:33.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:33 np0005591762 nova_compute[225313]: 2026-01-22 10:13:33.734 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:33 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:33 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:34 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:34 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:34 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:34.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:34 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:34 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:35 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:35 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:35 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:35.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.588 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.722 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.744 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.745 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.745 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 05:13:35 np0005591762 nova_compute[225313]: 2026-01-22 10:13:35.745 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:13:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:35 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:35 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:36 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:36 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:36 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:36.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:13:36 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3153014142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.129 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.372 225317 WARNING nova.virt.libvirt.driver [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.374 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4844MB free_disk=59.988277435302734GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": 
"0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.374 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.374 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:13:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:13:36 np0005591762 ceph-mon[75519]: from='mgr.14664 ' entity='mgr.compute-0.rfmoog' 
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.468 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.469 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.536 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing inventories for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.591 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating ProviderTree inventory for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.591 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Updating inventory in ProviderTree for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.606 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing aggregate associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.625 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Refreshing trait associations for resource provider 15be1e53-1c88-43bb-b33e-cd7166bd9713, traits: HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX512VAES,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AESNI,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.638 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 05:13:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:36 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:36 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:36 np0005591762 nova_compute[225313]: 2026-01-22 10:13:36.954 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:36 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 22 05:13:36 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4081655226' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 22 05:13:37 np0005591762 nova_compute[225313]: 2026-01-22 10:13:37.003 225317 DEBUG oslo_concurrency.processutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.366s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 05:13:37 np0005591762 nova_compute[225313]: 2026-01-22 10:13:37.008 225317 DEBUG nova.compute.provider_tree [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed in ProviderTree for provider: 15be1e53-1c88-43bb-b33e-cd7166bd9713 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 05:13:37 np0005591762 nova_compute[225313]: 2026-01-22 10:13:37.022 225317 DEBUG nova.scheduler.client.report [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Inventory has not changed for provider 15be1e53-1c88-43bb-b33e-cd7166bd9713 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 05:13:37 np0005591762 nova_compute[225313]: 2026-01-22 10:13:37.023 225317 DEBUG nova.compute.resource_tracker [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 05:13:37 np0005591762 nova_compute[225313]: 2026-01-22 10:13:37.023 225317 DEBUG oslo_concurrency.lockutils [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:13:37 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:37 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:37 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:37 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:37.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:37 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:37 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:38 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:38 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:38 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:38.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:38 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:38 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:38 np0005591762 podman[248739]: 2026-01-22 10:13:38.834069356 +0000 UTC m=+0.045850637 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 05:13:39 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:39 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:39 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:39.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:39 np0005591762 nova_compute[225313]: 2026-01-22 10:13:39.723 225317 DEBUG oslo_service.periodic_task [None req-266ca404-22ed-4c4a-adc5-aea9ecae0440 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 05:13:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:39 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:39 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:40 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:40 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:40 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:40.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:40 np0005591762 nova_compute[225313]: 2026-01-22 10:13:40.591 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:40 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:40 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:41 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:41 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:41 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:41.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:41 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:41 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:41 np0005591762 nova_compute[225313]: 2026-01-22 10:13:41.956 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:42 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:42 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:42 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:42 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:42.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:42 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:42 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:42 np0005591762 podman[248760]: 2026-01-22 10:13:42.844379402 +0000 UTC m=+0.064177332 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 05:13:43 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:43 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:43 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:43.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:43 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:43 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:44 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:44 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:44 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:44.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:44 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:44 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:45 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:45 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:45 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:45.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:45 np0005591762 nova_compute[225313]: 2026-01-22 10:13:45.591 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:45 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:45 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:46 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:46 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:46 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:46.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:46 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:46 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:46 np0005591762 nova_compute[225313]: 2026-01-22 10:13:46.958 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:47 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:47 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:47 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:47 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:47.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:13:47.218 143150 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 05:13:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:13:47.219 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 05:13:47 np0005591762 ovn_metadata_agent[143145]: 2026-01-22 10:13:47.219 143150 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 05:13:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:47 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:47 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:48 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:48 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:48 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:48.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:48 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:48 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:49 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:49 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:49 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:49.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:49 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:49 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:50 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:50 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:50 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:50.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:50 np0005591762 nova_compute[225313]: 2026-01-22 10:13:50.593 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:50 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:50 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:51 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:51 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:51 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:51.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:13:51 np0005591762 ceph-mon[75519]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 7234 writes, 37K keys, 7234 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 7234 writes, 7234 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1550 writes, 7871 keys, 1550 commit groups, 1.0 writes per commit group, ingest: 17.33 MB, 0.03 MB/s#012Interval WAL: 1550 writes, 1550 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    426.2      0.13              0.09        20    0.007       0      0       0.0       0.0#012  L6      1/0   13.68 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.5    505.7    432.1      0.58              0.40        19    0.031    110K    11K       0.0       0.0#012 Sum      1/0   13.68 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.5    411.9    431.0      0.72              0.50        39    0.018    110K    11K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.3    410.6    411.7      0.20              0.14        10    0.020     35K   3589       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    505.7    432.1      0.58              0.40        19    0.031    110K    11K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    430.0      0.13              0.09        19    0.007       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      1.3      0.00              0.00         1    0.001       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.055, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.30 GB write, 0.13 MB/s write, 0.29 GB read, 0.12 MB/s read, 0.7 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a025f49350#2 capacity: 304.00 MB usage: 26.20 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000256 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1599,25.36 MB,8.3408%) FilterBlock(39,310.92 KB,0.0998798%) IndexBlock(39,553.08 KB,0.17767%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 22 05:13:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:51 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:51 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:51 np0005591762 nova_compute[225313]: 2026-01-22 10:13:51.962 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:52 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:52 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:52 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:52 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:52.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:52 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:52 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:53 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:53 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:53 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:53.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:53 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:53 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:54 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:54 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:54 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:54.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:54 np0005591762 systemd-logind[744]: New session 57 of user zuul.
Jan 22 05:13:54 np0005591762 systemd[1]: Started Session 57 of User zuul.
Jan 22 05:13:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:54 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:54 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:55 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:55 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:55 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:55.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:55 np0005591762 nova_compute[225313]: 2026-01-22 10:13:55.595 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:55 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:55 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:56 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:56 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:13:56 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:56.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:13:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:56 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:56 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:56 np0005591762 nova_compute[225313]: 2026-01-22 10:13:56.964 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:13:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:13:57 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 22 05:13:57 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4099958387' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 22 05:13:57 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:57 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:57 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:57.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:57 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:57 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:58 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:58 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:13:58 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:13:58.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:13:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:58 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:58 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:59 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:13:59 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:13:59 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:13:59.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:13:59 np0005591762 ovs-vsctl[249142]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 05:13:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:13:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:13:59 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:13:59 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:00 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:00 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:14:00 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:00.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:14:00 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 05:14:00 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 05:14:00 np0005591762 virtqemud[225050]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 05:14:00 np0005591762 nova_compute[225313]: 2026-01-22 10:14:00.599 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:00 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: cache status {prefix=cache status} (starting...)
Jan 22 05:14:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:00 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:00 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:00 np0005591762 lvm[249433]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 22 05:14:00 np0005591762 lvm[249433]: VG ceph_vg0 finished
Jan 22 05:14:00 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: client ls {prefix=client ls} (starting...)
Jan 22 05:14:01 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:01 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:14:01 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:01.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:14:01 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: damage ls {prefix=damage ls} (starting...)
Jan 22 05:14:01 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump loads {prefix=dump loads} (starting...)
Jan 22 05:14:01 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 22 05:14:01 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 22 05:14:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 22 05:14:01 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/714730537' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 22 05:14:01 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 22 05:14:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:01 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:01 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:01 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 22 05:14:01 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1059030614' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 22 05:14:01 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 22 05:14:01 np0005591762 nova_compute[225313]: 2026-01-22 10:14:01.965 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:14:02 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 22 05:14:02 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:02 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:02 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:02.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1303172620' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 22 05:14:02 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 22 05:14:02 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: ops {prefix=ops} (starting...)
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1772800550' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2099731774' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 22 05:14:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:02 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:02 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 22 05:14:02 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/866339703' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 22 05:14:03 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: session ls {prefix=session ls} (starting...)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2326891902' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 22 05:14:03 np0005591762 ceph-mds[84734]: mds.cephfs.compute-2.zwrmjl asok_command: status {prefix=status} (starting...)
Jan 22 05:14:03 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:03 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:14:03 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:03.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/626081672' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1853695200' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2486604803' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 22 05:14:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:03 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:03 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4128229209' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 22 05:14:03 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2593010989' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4114120696' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2337878032' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 22 05:14:04 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:04 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:04 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:04.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1205542223' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3561887836' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 22 05:14:04 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/736355942' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 22 05:14:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:04 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:04 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 22 05:14:05 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2414072814' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 22 05:14:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 22 05:14:05 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/899940860' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 22 05:14:05 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:05 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:05 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:05.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:05 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 22 05:14:05 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1255706265' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 22 05:14:05 np0005591762 nova_compute[225313]: 2026-01-22 10:14:05.599 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:05 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:05 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:06 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:06 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:06 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:06.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 1163264 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 1163264 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874854 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 1146880 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 1146880 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.621748924s of 14.622872353s, submitted: 1
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73269248 unmapped: 1138688 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 874986 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 1114112 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73302016 unmapped: 1105920 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 1089536 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876514 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73334784 unmapped: 1073152 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.005198479s of 12.014612198s, submitted: 11
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875316 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c0c00 session 0x55bb15561a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875184 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875184 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.628890991s of 13.631286621s, submitted: 2
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875316 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876844 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876237 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.120885849s of 17.131437302s, submitted: 12
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73547776 unmapped: 860160 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73547776 unmapped: 860160 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73547776 unmapped: 860160 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73572352 unmapped: 835584 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 819200 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 819200 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 811008 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 811008 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 802816 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 802816 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 802816 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 770048 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 761856 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 761856 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 696320 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 671744 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 663552 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c3c00 session 0x55bb146aaf00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 647168 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 647168 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 647168 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 614400 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 606208 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 606208 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876105 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.835266113s of 55.836685181s, submitted: 1
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 606208 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 598016 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73818112 unmapped: 589824 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 877765 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 507904 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878518 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 507904 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 507904 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 491520 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 491520 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 483328 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878670 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 483328 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.329652786s of 16.341135025s, submitted: 13
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 74989568 unmapped: 466944 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75014144 unmapped: 442368 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75014144 unmapped: 442368 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75022336 unmapped: 434176 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75022336 unmapped: 434176 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75030528 unmapped: 425984 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75038720 unmapped: 417792 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75046912 unmapped: 409600 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75055104 unmapped: 401408 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75055104 unmapped: 401408 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75055104 unmapped: 401408 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75063296 unmapped: 393216 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75063296 unmapped: 393216 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75063296 unmapped: 393216 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75071488 unmapped: 385024 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75087872 unmapped: 368640 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75096064 unmapped: 360448 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75096064 unmapped: 360448 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e30400 session 0x55bb1691bc20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75096064 unmapped: 360448 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75104256 unmapped: 352256 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75104256 unmapped: 352256 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 5918 writes, 25K keys, 5918 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5918 writes, 1040 syncs, 5.69 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5918 writes, 25K keys, 5918 commit groups, 1.0 writes per commit group, ingest: 19.22 MB, 0.03 MB/s
Interval WAL: 5918 writes, 1040 syncs, 5.69 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75210752 unmapped: 245760 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878538 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.951286316s of 33.952785492s, submitted: 1
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75227136 unmapped: 229376 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75235328 unmapped: 221184 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75251712 unmapped: 204800 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75284480 unmapped: 172032 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75292672 unmapped: 163840 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880198 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75292672 unmapped: 163840 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75317248 unmapped: 139264 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75325440 unmapped: 131072 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75325440 unmapped: 131072 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75350016 unmapped: 106496 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879439 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75358208 unmapped: 98304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75358208 unmapped: 98304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.006448746s of 12.019389153s, submitted: 12
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75382784 unmapped: 73728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75382784 unmapped: 73728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75382784 unmapped: 73728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879000 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75390976 unmapped: 65536 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75407360 unmapped: 49152 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75415552 unmapped: 40960 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75415552 unmapped: 40960 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75423744 unmapped: 32768 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75431936 unmapped: 24576 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1015808 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 688128 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75825152 unmapped: 679936 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 671744 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 663552 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 663552 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 663552 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 655360 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 655360 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 647168 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 647168 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 638976 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 638976 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16ad9400 session 0x55bb155612c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 638976 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 630784 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.670803070s of 32.730369568s, submitted: 84
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 77168640 unmapped: 1433600 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878868 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 180224 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879000 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 172032 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.134506226s of 12.238805771s, submitted: 166
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78454784 unmapped: 147456 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78454784 unmapped: 147456 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78454784 unmapped: 147456 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880528 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 131072 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 131072 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 122880 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 122880 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78487552 unmapped: 114688 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879769 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 106496 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 106496 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78503936 unmapped: 98304 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78503936 unmapped: 98304 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 90112 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879921 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.997126579s of 13.009223938s, submitted: 9
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 90112 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb14672c00 session 0x55bb172f65a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 90112 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78520320 unmapped: 81920 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 65536 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78544896 unmapped: 57344 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879789 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78544896 unmapped: 57344 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78544896 unmapped: 57344 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78553088 unmapped: 49152 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78553088 unmapped: 49152 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 40960 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879789 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 40960 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 32768 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.138556480s of 12.139929771s, submitted: 1
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78577664 unmapped: 24576 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78577664 unmapped: 24576 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78585856 unmapped: 16384 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879921 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78610432 unmapped: 1040384 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78610432 unmapped: 1040384 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 1032192 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78635008 unmapped: 1015808 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78643200 unmapped: 1007616 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882961 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78651392 unmapped: 999424 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78659584 unmapped: 991232 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78667776 unmapped: 983040 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78667776 unmapped: 983040 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78675968 unmapped: 974848 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882354 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78684160 unmapped: 966656 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78684160 unmapped: 966656 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb1581c400 session 0x55bb147a3680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78692352 unmapped: 958464 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78692352 unmapped: 958464 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.831901550s of 16.840600967s, submitted: 12
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78700544 unmapped: 950272 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882222 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78700544 unmapped: 950272 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 942080 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 942080 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 942080 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 22 05:14:06 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/400058622' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78725120 unmapped: 925696 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882222 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78725120 unmapped: 925696 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78733312 unmapped: 917504 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78733312 unmapped: 917504 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78749696 unmapped: 901120 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882354 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.236344337s of 12.239408493s, submitted: 3
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 860160 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb1466e400 session 0x55bb147a30e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 827392 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885394 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885394 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78848000 unmapped: 802816 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78848000 unmapped: 802816 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.904095650s of 10.914563179s, submitted: 9
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884787 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [1])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884803 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.533043861s of 11.542860031s, submitted: 10
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884044 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883605 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78905344 unmapped: 745472 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883473 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c7800 session 0x55bb168632c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169c0400 session 0x55bb1641c000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883473 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883473 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.116767883s of 25.120988846s, submitted: 4
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883737 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 712704 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885265 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 663552 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885097 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.843386650s of 15.856459618s, submitted: 13
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884985 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e31000 session 0x55bb166e8000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884985 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884985 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.390679359s of 11.392770767s, submitted: 2
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 1687552 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 1687552 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 1687552 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 1679360 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886645 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e32c00 session 0x55bb146ab4a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 1679360 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 1679360 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886645 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.008318901s of 12.015848160s, submitted: 9
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886038 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 1654784 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 1638400 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 1638400 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887566 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 1638400 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 1613824 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 1613824 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 1613824 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.801233292s of 11.814348221s, submitted: 12
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889078 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888471 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 1605632 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 1597440 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 1589248 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb169bd800 session 0x55bb1796e5a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888339 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 84.316741943s of 84.330902100s, submitted: 3
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888471 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889999 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79183872 unmapped: 1515520 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79200256 unmapped: 1499136 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79200256 unmapped: 1499136 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890752 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.004010201s of 12.014162064s, submitted: 12
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79224832 unmapped: 1474560 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890313 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79249408 unmapped: 1449984 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 1441792 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 1441792 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79323136 unmapped: 1376256 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79339520 unmapped: 1359872 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79347712 unmapped: 1351680 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79364096 unmapped: 1335296 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e2f800 session 0x55bb146aba40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 ms_handle_reset con 0x55bb16e30c00 session 0x55bb146aa780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890181 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 110.737770081s of 110.740150452s, submitted: 2
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79405056 unmapped: 1294336 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890461 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79437824 unmapped: 1261568 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79437824 unmapped: 1261568 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79437824 unmapped: 1261568 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79454208 unmapped: 1245184 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79470592 unmapped: 1228800 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890461 data_alloc: 218103808 data_used: 4096
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.267648697s of 10.278943062s, submitted: 11
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79478784 unmapped: 1220608 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79486976 unmapped: 1212416 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889263 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca7c000/0x0/0x4ffc00000, data 0xf69ff/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888999 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf8af3/0x1a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fbe03000/0x0/0x4ffc00000, data 0xd6ac61/0xe18000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 17768448 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.349076271s of 10.382431030s, submitted: 31
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 139 ms_handle_reset con 0x55bb166d8000 session 0x55bb179d8780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 17768448 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79839232 unmapped: 17645568 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 140 ms_handle_reset con 0x55bb166d9c00 session 0x55bb179952c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79847424 unmapped: 17637376 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1071604 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79855616 unmapped: 17629184 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79855616 unmapped: 17629184 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fb18c000/0x0/0x4ffc00000, data 0x19dee94/0x1a8f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79872000 unmapped: 17612800 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fb18c000/0x0/0x4ffc00000, data 0x19dee94/0x1a8f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79872000 unmapped: 17612800 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074530 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074530 data_alloc: 218103808 data_used: 8192
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79880192 unmapped: 17604608 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fb189000/0x0/0x4ffc00000, data 0x19e0e66/0x1a92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 79888384 unmapped: 17596416 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1074682 data_alloc: 218103808 data_used: 12288
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 ms_handle_reset con 0x55bb166d6000 session 0x55bb152c1e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 ms_handle_reset con 0x55bb169c5c00 session 0x55bb155614a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 91545600 unmapped: 5939200 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.163917542s of 39.186256409s, submitted: 33
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 91545600 unmapped: 5939200 heap: 97484800 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb14673c00 session 0x55bb172f7680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d6000 session 0x55bb169230e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d8000 session 0x55bb16862000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d9c00 session 0x55bb179243c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb169c5c00 session 0x55bb152c01e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fb186000/0x0/0x4ffc00000, data 0x19e2f52/0x1a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb16c50c00 session 0x55bb163f3e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167932 data_alloc: 234881024 data_used: 11485184
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fac61000/0x0/0x4ffc00000, data 0x1f050f4/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb166d6000 session 0x55bb16526000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93331456 unmapped: 8486912 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb13e61400 session 0x55bb156f9c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 ms_handle_reset con 0x55bb1466fc00 session 0x55bb176cd4a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93052928 unmapped: 8765440 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 93069312 unmapped: 8749056 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201211 data_alloc: 234881024 data_used: 15548416
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac3a000/0x0/0x4ffc00000, data 0x1f2b0e9/0x1fe1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac3a000/0x0/0x4ffc00000, data 0x1f2b0e9/0x1fe1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201211 data_alloc: 234881024 data_used: 15548416
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 4710400 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.756557465s of 16.806152344s, submitted: 91
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 99557376 unmapped: 2260992 heap: 101818368 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103743488 unmapped: 4366336 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f91a3000/0x0/0x4ffc00000, data 0x28140e9/0x28ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f91a3000/0x0/0x4ffc00000, data 0x28140e9/0x28ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279961 data_alloc: 234881024 data_used: 15925248
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb14a99400 session 0x55bb17938f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9175000/0x0/0x4ffc00000, data 0x28510e9/0x2907000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9175000/0x0/0x4ffc00000, data 0x28510e9/0x2907000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31c00 session 0x55bb17939860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279145 data_alloc: 234881024 data_used: 15925248
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9173000/0x0/0x4ffc00000, data 0x28530e9/0x2909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9173000/0x0/0x4ffc00000, data 0x28530e9/0x2909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279145 data_alloc: 234881024 data_used: 15925248
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.602156639s of 12.680288315s, submitted: 136
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9171000/0x0/0x4ffc00000, data 0x28550e9/0x290b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9171000/0x0/0x4ffc00000, data 0x28550e9/0x290b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 103776256 unmapped: 4333568 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278989 data_alloc: 234881024 data_used: 15929344
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad8c00 session 0x55bb156f8960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6000 session 0x55bb156f90e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c1800 session 0x55bb1691a1e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb1691bc20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105324544 unmapped: 2785280 heap: 108109824 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be000 session 0x55bb1691a780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581cc00 session 0x55bb16526780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb16526f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be000 session 0x55bb163985a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c1800 session 0x55bb16399c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6000 session 0x55bb16398b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb16399a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3f000/0x0/0x4ffc00000, data 0x2e860f9/0x2f3d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1332237 data_alloc: 234881024 data_used: 16453632
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33400 session 0x55bb16394b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.629212379s of 10.656615257s, submitted: 21
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4c00 session 0x55bb16395e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 6971392 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb1647cf00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c3800 session 0x55bb172f70e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3e000/0x0/0x4ffc00000, data 0x2e86109/0x2f3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105488384 unmapped: 6955008 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 105504768 unmapped: 6938624 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336136 data_alloc: 234881024 data_used: 16490496
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3e000/0x0/0x4ffc00000, data 0x2e86109/0x2f3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110829568 unmapped: 1613824 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 1572864 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1379372 data_alloc: 234881024 data_used: 22847488
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 1572864 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.993770599s of 10.007340431s, submitted: 18
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b3c000/0x0/0x4ffc00000, data 0x2e87109/0x2f3f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110903296 unmapped: 1540096 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110903296 unmapped: 1540096 heap: 112443392 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4292608 heap: 117686272 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118677504 unmapped: 3211264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1520512 data_alloc: 234881024 data_used: 23384064
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118677504 unmapped: 3211264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118677504 unmapped: 3211264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7643000/0x0/0x4ffc00000, data 0x3f69109/0x4021000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 3178496 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 3178496 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 3178496 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1520512 data_alloc: 234881024 data_used: 23384064
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 4841472 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7648000/0x0/0x4ffc00000, data 0x3f6c109/0x4024000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e800 session 0x55bb179f2780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.464450836s of 10.549235344s, submitted: 131
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb165472c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 4833280 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16a5b800 session 0x55bb1691a5a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 10584064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 10584064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8d60000/0x0/0x4ffc00000, data 0x28560e9/0x290c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 10584064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296941 data_alloc: 234881024 data_used: 16453632
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51c00 session 0x55bb176cc5a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d8400 session 0x55bb1796e780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109182976 unmapped: 12705792 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33c00 session 0x55bb1691a1e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148353 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb147a21e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148353 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148353 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.459398270s of 20.503299713s, submitted: 84
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bcf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109256704 unmapped: 12632064 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147901 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 12599296 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb146b1a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31800 session 0x55bb172f70e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108806144 unmapped: 13082624 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108806144 unmapped: 13082624 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108814336 unmapped: 13074432 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb146aa3c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108257280 unmapped: 13631488 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1191849 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f989a000/0x0/0x4ffc00000, data 0x1d1d0c6/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f989a000/0x0/0x4ffc00000, data 0x1d1d0c6/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.978419304s of 12.022029877s, submitted: 57
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1204922 data_alloc: 234881024 data_used: 13922304
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f989a000/0x0/0x4ffc00000, data 0x1d1d0c6/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 108142592 unmapped: 13746176 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 114130944 unmapped: 7757824 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276110 data_alloc: 234881024 data_used: 14196736
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8fa4000/0x0/0x4ffc00000, data 0x26050c6/0x26ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 10461184 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284076 data_alloc: 234881024 data_used: 14118912
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.410071373s of 12.487000465s, submitted: 130
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 10747904 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284092 data_alloc: 234881024 data_used: 14118912
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284092 data_alloc: 234881024 data_used: 14118912
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 10739712 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f88000/0x0/0x4ffc00000, data 0x26260c6/0x26db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 10731520 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284244 data_alloc: 234881024 data_used: 14123008
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.180794716s of 14.182448387s, submitted: 1
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb172f6960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb13e825a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111165440 unmapped: 10723328 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb146b1c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109461504 unmapped: 12427264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 7997 writes, 31K keys, 7997 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 7997 writes, 1974 syncs, 4.05 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2079 writes, 6139 keys, 2079 commit groups, 1.0 writes per commit group, ingest: 6.38 MB, 0.01 MB/s
Interval WAL: 2079 writes, 934 syncs, 2.23 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb12ec3350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memta
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109461504 unmapped: 12427264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109461504 unmapped: 12427264 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 12419072 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 12410880 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb17938780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6800 session 0x55bb1691a960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb17938f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb16922000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 12402688 heap: 121888768 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb147a2960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb163f2f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bfc00 session 0x55bb156f9c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.919490814s of 19.946268082s, submitted: 38
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb176cc780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e36000/0x0/0x4ffc00000, data 0x2782064/0x2836000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27140096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266128 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb17925a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 27901952 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118661120 unmapped: 19718144 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118661120 unmapped: 19718144 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118693888 unmapped: 19685376 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118693888 unmapped: 19685376 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1354209 data_alloc: 234881024 data_used: 24870912
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118726656 unmapped: 19652608 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.768112183s of 10.846858978s, submitted: 112
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 19644416 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 19644416 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8e35000/0x0/0x4ffc00000, data 0x2782087/0x2837000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 19644416 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 19628032 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355737 data_alloc: 234881024 data_used: 24866816
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 123494400 unmapped: 14884864 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f832a000/0x0/0x4ffc00000, data 0x328d087/0x3342000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125542400 unmapped: 12836864 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8268000/0x0/0x4ffc00000, data 0x3346087/0x33fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125575168 unmapped: 12804096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8268000/0x0/0x4ffc00000, data 0x3346087/0x33fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1461459 data_alloc: 234881024 data_used: 25677824
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.644487381s of 10.730033875s, submitted: 144
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124919808 unmapped: 13459456 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f824d000/0x0/0x4ffc00000, data 0x336a087/0x341f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1454048 data_alloc: 234881024 data_used: 25743360
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f824d000/0x0/0x4ffc00000, data 0x336a087/0x341f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 13443072 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 13418496 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 13352960 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1454056 data_alloc: 234881024 data_used: 25743360
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6000 session 0x55bb17939c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51400 session 0x55bb172f6000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125575168 unmapped: 12804096 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7854000/0x0/0x4ffc00000, data 0x3d620e9/0x3e18000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0800 session 0x55bb17999a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125607936 unmapped: 12771328 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0800 session 0x55bb17924960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7854000/0x0/0x4ffc00000, data 0x3d620e9/0x3e18000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e000 session 0x55bb16395a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.622238159s of 10.764037132s, submitted: 213
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6000 session 0x55bb1796f0e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 12713984 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 12689408 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133931008 unmapped: 4448256 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1600375 data_alloc: 251658240 data_used: 34828288
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133931008 unmapped: 4448256 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133931008 unmapped: 4448256 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7850000/0x0/0x4ffc00000, data 0x3d6510c/0x3e1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b8800 session 0x55bb17998f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1600751 data_alloc: 251658240 data_used: 34832384
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133963776 unmapped: 4415488 heap: 138379264 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7850000/0x0/0x4ffc00000, data 0x3d6510c/0x3e1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.880483627s of 10.891222000s, submitted: 13
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137666560 unmapped: 5226496 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f68c8000/0x0/0x4ffc00000, data 0x4ce710c/0x4d9e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138551296 unmapped: 4341760 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1734917 data_alloc: 251658240 data_used: 36098048
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138633216 unmapped: 4259840 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f000 session 0x55bb1796ef00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138633216 unmapped: 4259840 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f682c000/0x0/0x4ffc00000, data 0x4d7a10c/0x4e31000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 4947968 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 4939776 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 4939776 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1728837 data_alloc: 251658240 data_used: 36098048
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 4964352 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f681b000/0x0/0x4ffc00000, data 0x4d9a10c/0x4e51000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 4964352 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb16394b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51400 session 0x55bb176cd2c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130326528 unmapped: 12566528 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839800 session 0x55bb149b94a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 12525568 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f823e000/0x0/0x4ffc00000, data 0x3378087/0x342d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 12525568 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1470835 data_alloc: 234881024 data_used: 25735168
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 12443648 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d400 session 0x55bb1796e5a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.900511742s of 13.028066635s, submitted: 225
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0400 session 0x55bb176cd0e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120381440 unmapped: 22511616 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120381440 unmapped: 22511616 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120381440 unmapped: 22511616 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 22462464 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193777 data_alloc: 234881024 data_used: 12009472
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 22462464 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 22462464 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195289 data_alloc: 234881024 data_used: 12009472
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.551191330s of 11.578843117s, submitted: 39
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195141 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9bd1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 22421504 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 120479744 unmapped: 22413312 heap: 142893056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195009 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c2400 session 0x55bb17924b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad9400 session 0x55bb16526780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb163f3e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51000 session 0x55bb179d9a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9400 session 0x55bb146ad680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb17924d20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f000 session 0x55bb179990e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c2400 session 0x55bb179d85a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad9400 session 0x55bb17360960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 38436864 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51000 session 0x55bb17924f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.872282982s of 10.916283607s, submitted: 41
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb179241e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899e000/0x0/0x4ffc00000, data 0x2c190c6/0x2cce000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 121348096 unmapped: 38412288 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1332015 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129728512 unmapped: 30031872 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1459543 data_alloc: 251658240 data_used: 30892032
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129761280 unmapped: 29999104 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129794048 unmapped: 29966336 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129802240 unmapped: 29958144 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129802240 unmapped: 29958144 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.577187538s of 10.583614349s, submitted: 8
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [2])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137314304 unmapped: 22446080 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1542929 data_alloc: 251658240 data_used: 31178752
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f899d000/0x0/0x4ffc00000, data 0x2c190d6/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137576448 unmapped: 22183936 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e31000/0x0/0x4ffc00000, data 0x37850d6/0x383b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e31000/0x0/0x4ffc00000, data 0x37850d6/0x383b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1556209 data_alloc: 251658240 data_used: 32387072
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 20889600 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e12000/0x0/0x4ffc00000, data 0x37a40d6/0x385a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1555001 data_alloc: 251658240 data_used: 32391168
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e12000/0x0/0x4ffc00000, data 0x37a40d6/0x385a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0400 session 0x55bb17995e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139001856 unmapped: 20758528 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.999485970s of 12.082794189s, submitted: 137
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139108352 unmapped: 20652032 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139108352 unmapped: 20652032 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139108352 unmapped: 20652032 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e05000/0x0/0x4ffc00000, data 0x37b10d6/0x3867000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1555305 data_alloc: 251658240 data_used: 32391168
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e05000/0x0/0x4ffc00000, data 0x37b10d6/0x3867000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1556037 data_alloc: 251658240 data_used: 32399360
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7e00000/0x0/0x4ffc00000, data 0x37b60d6/0x386c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 20643840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.517280579s of 10.523816109s, submitted: 7
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139239424 unmapped: 20520960 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb17a1c400 session 0x55bb179952c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4c00 session 0x55bb179d9680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd000 session 0x55bb163f3680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7df2000/0x0/0x4ffc00000, data 0x37c40d6/0x387a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb13e82f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c0400 session 0x55bb13e82960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4c00 session 0x55bb179250e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb17a1c400 session 0x55bb179254a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4800 session 0x55bb176cd860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4800 session 0x55bb163985a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139501568 unmapped: 20258816 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139501568 unmapped: 20258816 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1622344 data_alloc: 251658240 data_used: 32395264
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7592000/0x0/0x4ffc00000, data 0x4022148/0x40da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139616256 unmapped: 20144128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139616256 unmapped: 20144128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be800 session 0x55bb165270e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c1400 session 0x55bb176cc780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139616256 unmapped: 20144128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bac00 session 0x55bb17925a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33c00 session 0x55bb176ccf00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139632640 unmapped: 20127744 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144302080 unmapped: 15458304 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1668826 data_alloc: 251658240 data_used: 37187584
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7591000/0x0/0x4ffc00000, data 0x402216b/0x40db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f758f000/0x0/0x4ffc00000, data 0x402316b/0x40dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.890444756s of 11.935307503s, submitted: 51
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1677218 data_alloc: 251658240 data_used: 37236736
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144547840 unmapped: 15212544 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 144441344 unmapped: 15319040 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f758c000/0x0/0x4ffc00000, data 0x402716b/0x40e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147808256 unmapped: 11952128 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1772648 data_alloc: 251658240 data_used: 37298176
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f6573000/0x0/0x4ffc00000, data 0x4c2f16b/0x4ce8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148078592 unmapped: 11681792 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.307511330s of 10.372504234s, submitted: 103
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147324928 unmapped: 12435456 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1768056 data_alloc: 251658240 data_used: 37302272
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147324928 unmapped: 12435456 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f6572000/0x0/0x4ffc00000, data 0x4c3116b/0x4cea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1768432 data_alloc: 251658240 data_used: 37302272
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f656d000/0x0/0x4ffc00000, data 0x4c3616b/0x4cef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147357696 unmapped: 12402688 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f656c000/0x0/0x4ffc00000, data 0x4c3716b/0x4cf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 147365888 unmapped: 12394496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e33c00 session 0x55bb149b8960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bac00 session 0x55bb146ac780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f6569000/0x0/0x4ffc00000, data 0x4c3a16b/0x4cf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bf800 session 0x55bb16922f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1570754 data_alloc: 251658240 data_used: 29581312
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f79ce000/0x0/0x4ffc00000, data 0x37d60d6/0x388c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f79ce000/0x0/0x4ffc00000, data 0x37d60d6/0x388c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.750416756s of 13.788337708s, submitted: 53
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e800 session 0x55bb16398d20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f000 session 0x55bb13e83a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f79cd000/0x0/0x4ffc00000, data 0x37d90d6/0x388f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141869056 unmapped: 17891328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1570022 data_alloc: 251658240 data_used: 29581312
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bac00 session 0x55bb179385a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224037 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224037 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c0000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224037 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128901120 unmapped: 30859264 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.484128952s of 19.514802933s, submitted: 43
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bd800 session 0x55bb1a07d860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282737 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9187000/0x0/0x4ffc00000, data 0x2021064/0x20d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128253952 unmapped: 31506432 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16a5a800 session 0x55bb152c1860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128245760 unmapped: 31514624 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9163000/0x0/0x4ffc00000, data 0x2045064/0x20f9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319037 data_alloc: 234881024 data_used: 17121280
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 29401088 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb17995680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d9c00 session 0x55bb179d83c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 29917184 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb17994f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231281 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97c1000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231281 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127082496 unmapped: 32677888 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.315660477s of 20.341112137s, submitted: 34
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f800 session 0x55bb146ab2c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b9400 session 0x55bb17360b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f924f000/0x0/0x4ffc00000, data 0x1f580c6/0x200d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280686 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127533056 unmapped: 32227328 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c9d800 session 0x55bb176ccf00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 127565824 unmapped: 32194560 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f924e000/0x0/0x4ffc00000, data 0x1f580e9/0x200e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 30883840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320451 data_alloc: 234881024 data_used: 17633280
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 30883840 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b8c00 session 0x55bb176cd860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb179383c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1238459 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1238459 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f000 session 0x55bb179f3680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1238459 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97bf000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 32874496 heap: 159760384 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1783f800 session 0x55bb179f32c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30400 session 0x55bb179f3860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c51400 session 0x55bb179f23c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb179f3e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.602361679s of 26.647577286s, submitted: 61
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f000 session 0x55bb179f2000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30400 session 0x55bb13e82960
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1783f800 session 0x55bb13e830e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e000 session 0x55bb146aad20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466f800 session 0x55bb163f3a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352110 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16a5ac00 session 0x55bb179994a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a49000/0x0/0x4ffc00000, data 0x275d0d6/0x2813000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5800 session 0x55bb147a21e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 34537472 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb16527860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31000 session 0x55bb17939c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128393216 unmapped: 34521088 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 128393216 unmapped: 34521088 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355610 data_alloc: 234881024 data_used: 12021760
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [2])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131874816 unmapped: 31039488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442402 data_alloc: 234881024 data_used: 24817664
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 131956736 unmapped: 30957568 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a47000/0x0/0x4ffc00000, data 0x275d109/0x2815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.114472389s of 14.163391113s, submitted: 61
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 140820480 unmapped: 22093824 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 140992512 unmapped: 21921792 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532334 data_alloc: 234881024 data_used: 25493504
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141041664 unmapped: 21872640 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141041664 unmapped: 21872640 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141041664 unmapped: 21872640 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7ece000/0x0/0x4ffc00000, data 0x32d0109/0x3388000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1534478 data_alloc: 234881024 data_used: 25710592
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7ed4000/0x0/0x4ffc00000, data 0x32d0109/0x3388000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1531874 data_alloc: 234881024 data_used: 25718784
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 21839872 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.299962044s of 13.376144409s, submitted: 148
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb163f2f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5800 session 0x55bb176cc5a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e400 session 0x55bb17998b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256161 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256161 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256161 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f953f000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 133431296 unmapped: 29483008 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb17995860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5400 session 0x55bb1a07d4a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb179254a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb163943c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.407205582s of 15.435142517s, submitted: 51
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c5800 session 0x55bb163f3680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2e400 session 0x55bb16863a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16c9cc00 session 0x55bb179954a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb152c1860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb13e82780
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f92a1000/0x0/0x4ffc00000, data 0x1f07064/0x1fbb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299362 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132251648 unmapped: 30662656 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb1691be00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f927d000/0x0/0x4ffc00000, data 0x1f2b064/0x1fdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1340147 data_alloc: 234881024 data_used: 17301504
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f927d000/0x0/0x4ffc00000, data 0x1f2b064/0x1fdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132382720 unmapped: 30531584 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1340147 data_alloc: 234881024 data_used: 17301504
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb179994a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bb000 session 0x55bb17998b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bb000 session 0x55bb17998f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb179f3860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.089474678s of 13.115797043s, submitted: 25
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb179f3e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb152c1e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb156f81e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1739e000 session 0x55bb164452c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb179d8d20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 132636672 unmapped: 30277632 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f927d000/0x0/0x4ffc00000, data 0x1f2b064/0x1fdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136314880 unmapped: 26599424 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb176cd860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134701056 unmapped: 28213248 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bb000 session 0x55bb16445680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb16547680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c4000 session 0x55bb179f2b40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8303000/0x0/0x4ffc00000, data 0x2ea30d6/0x2f59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134709248 unmapped: 28205056 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8301000/0x0/0x4ffc00000, data 0x2ea3109/0x2f5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134717440 unmapped: 28196864 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1477652 data_alloc: 234881024 data_used: 17694720
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134217728 unmapped: 28696576 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134217728 unmapped: 28696576 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134217728 unmapped: 28696576 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 28688384 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 28688384 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1506156 data_alloc: 234881024 data_used: 21958656
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82ff000/0x0/0x4ffc00000, data 0x2ea5109/0x2f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82ff000/0x0/0x4ffc00000, data 0x2ea5109/0x2f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 28680192 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.724095345s of 13.820690155s, submitted: 143
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 140476416 unmapped: 22437888 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1613132 data_alloc: 234881024 data_used: 22204416
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141082624 unmapped: 21831680 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141090816 unmapped: 21823488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624194 data_alloc: 234881024 data_used: 22216704
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141090816 unmapped: 21823488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141090816 unmapped: 21823488 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141123584 unmapped: 21790720 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141123584 unmapped: 21790720 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141123584 unmapped: 21790720 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624210 data_alloc: 234881024 data_used: 22216704
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141131776 unmapped: 21782528 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141131776 unmapped: 21782528 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141139968 unmapped: 21774336 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141139968 unmapped: 21774336 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141139968 unmapped: 21774336 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624210 data_alloc: 234881024 data_used: 22216704
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f63d6000/0x0/0x4ffc00000, data 0x3c25109/0x3cdd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.310153961s of 16.393232346s, submitted: 138
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb15839c00 session 0x55bb179f21e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb166d6800 session 0x55bb179943c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 141164544 unmapped: 21749760 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb179f2d20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7354000/0x0/0x4ffc00000, data 0x2969064/0x2a1d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 23379968 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1441971 data_alloc: 234881024 data_used: 17686528
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169b9800 session 0x55bb1a07d860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bf400 session 0x55bb1796f860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb13e61400 session 0x55bb176cc1e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f80ad000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286343 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f80ad000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286343 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 25690112 heap: 162914304 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30400 session 0x55bb155605a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169be800 session 0x55bb17995860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb156dd800 session 0x55bb155601e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e2f000 session 0x55bb16399e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f80ad000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.128541946s of 19.187244415s, submitted: 97
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb14672800 session 0x55bb163f23c0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb14a98c00 session 0x55bb1a1145a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16ad9c00 session 0x55bb17938f00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136527872 unmapped: 33210368 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386118 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d800 session 0x55bb16398d20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d800 session 0x55bb1a07c000
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f793e000/0x0/0x4ffc00000, data 0x26c9074/0x277e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f793e000/0x0/0x4ffc00000, data 0x26c9074/0x277e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f793e000/0x0/0x4ffc00000, data 0x26c9074/0x277e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1385894 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e31400 session 0x55bb1691b860
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169bc000 session 0x55bb173614a0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136511488 unmapped: 33226752 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb169c6400 session 0x55bb179f3a40
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb16923e00
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 136814592 unmapped: 32923648 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1483305 data_alloc: 234881024 data_used: 25530368
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142680064 unmapped: 27058176 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1483305 data_alloc: 234881024 data_used: 25530368
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7919000/0x0/0x4ffc00000, data 0x26ed084/0x27a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 142712832 unmapped: 27025408 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.360231400s of 16.395832062s, submitted: 35
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 145932288 unmapped: 23805952 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146391040 unmapped: 23347200 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146391040 unmapped: 23347200 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1553735 data_alloc: 234881024 data_used: 25776128
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f5fd5000/0x0/0x4ffc00000, data 0x2e72084/0x2f28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146440192 unmapped: 23298048 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1553735 data_alloc: 234881024 data_used: 25776128
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread fragmentation_score=0.000363 took=0.000023s
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f5fd5000/0x0/0x4ffc00000, data 0x2e72084/0x2f28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146448384 unmapped: 23289856 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1553735 data_alloc: 234881024 data_used: 25776128
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.504435539s of 14.553052902s, submitted: 89
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb16e30800 session 0x55bb176cd0e0
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1581d800 session 0x55bb17925680
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 146464768 unmapped: 23273472 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 ms_handle_reset con 0x55bb1466e400 session 0x55bb147a3c20
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 31727616 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 31719424 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 31711232 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 31711232 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 31703040 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138043392 unmapped: 31694848 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138051584 unmapped: 31686656 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 31678464 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138067968 unmapped: 31670272 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138076160 unmapped: 31662080 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config diff' '{prefix=config diff}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config show' '{prefix=config show}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter dump' '{prefix=counter dump}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137732096 unmapped: 32006144 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter schema' '{prefix=counter schema}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 32415744 heap: 169738240 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'log dump' '{prefix=log dump}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 148676608 unmapped: 32104448 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'perf dump' '{prefix=perf dump}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'perf schema' '{prefix=perf schema}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137428992 unmapped: 43352064 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137469952 unmapped: 43311104 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137469952 unmapped: 43311104 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137469952 unmapped: 43311104 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137486336 unmapped: 43294720 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137494528 unmapped: 43286528 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137494528 unmapped: 43286528 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137117696 unmapped: 43663360 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137117696 unmapped: 43663360 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137125888 unmapped: 43655168 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137125888 unmapped: 43655168 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137125888 unmapped: 43655168 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137125888 unmapped: 43655168 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137142272 unmapped: 43638784 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137158656 unmapped: 43622400 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137158656 unmapped: 43622400 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 43614208 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 43614208 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 43614208 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 43614208 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 43614208 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 43614208 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137175040 unmapped: 43606016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137199616 unmapped: 43581440 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137216000 unmapped: 43565056 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 43556864 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 43556864 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 43556864 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 43556864 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 43556864 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137224192 unmapped: 43556864 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137232384 unmapped: 43548672 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137232384 unmapped: 43548672 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137248768 unmapped: 43532288 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137256960 unmapped: 43524096 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137256960 unmapped: 43524096 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137265152 unmapped: 43515904 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137265152 unmapped: 43515904 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137265152 unmapped: 43515904 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137265152 unmapped: 43515904 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137265152 unmapped: 43515904 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137265152 unmapped: 43515904 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137281536 unmapped: 43499520 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137281536 unmapped: 43499520 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137289728 unmapped: 43491328 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137289728 unmapped: 43491328 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137289728 unmapped: 43491328 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137289728 unmapped: 43491328 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137289728 unmapped: 43491328 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137297920 unmapped: 43483136 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137306112 unmapped: 43474944 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137322496 unmapped: 43458560 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137330688 unmapped: 43450368 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137338880 unmapped: 43442176 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137338880 unmapped: 43442176 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137338880 unmapped: 43442176 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137347072 unmapped: 43433984 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 43425792 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 43425792 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 43425792 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 43425792 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137363456 unmapped: 43417600 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137379840 unmapped: 43401216 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137396224 unmapped: 43384832 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137404416 unmapped: 43376640 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137404416 unmapped: 43376640 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137404416 unmapped: 43376640 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137404416 unmapped: 43376640 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137404416 unmapped: 43376640 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137404416 unmapped: 43376640 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137412608 unmapped: 43368448 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137412608 unmapped: 43368448 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137412608 unmapped: 43368448 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137412608 unmapped: 43368448 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137420800 unmapped: 43360256 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137420800 unmapped: 43360256 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137420800 unmapped: 43360256 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: [db/db_impl/db_impl.cc:1111]
                                              ** DB Stats **
                                              Uptime(secs): 1800.1 total, 600.0 interval
                                              Cumulative writes: 11K writes, 44K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                              Cumulative WAL: 11K writes, 3600 syncs, 3.26 writes per sync, written: 0.04 GB, 0.02 MB/s
                                              Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                              Interval writes: 3744 writes, 13K keys, 3744 commit groups, 1.0 writes per commit group, ingest: 15.38 MB, 0.03 MB/s
                                              Interval WAL: 3744 writes, 1626 syncs, 2.30 writes per sync, written: 0.02 GB, 0.03 MB/s
                                              Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137428992 unmapped: 43352064 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137428992 unmapped: 43352064 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137428992 unmapped: 43352064 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137437184 unmapped: 43343872 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137437184 unmapped: 43343872 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137445376 unmapped: 43335680 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137453568 unmapped: 43327488 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137461760 unmapped: 43319296 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137461760 unmapped: 43319296 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137469952 unmapped: 43311104 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137469952 unmapped: 43311104 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 253.694488525s of 253.721832275s, submitted: 47
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137469952 unmapped: 43311104 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137551872 unmapped: 43229184 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137560064 unmapped: 43220992 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137568256 unmapped: 43212800 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137568256 unmapped: 43212800 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137568256 unmapped: 43212800 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x6cdf9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137568256 unmapped: 43212800 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.417669296s of 23.477962494s, submitted: 82
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137576448 unmapped: 43204608 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137707520 unmapped: 43073536 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137822208 unmapped: 42958848 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: mgrc ms_handle_reset ms_handle_reset con 0x55bb156dc800
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1082790531
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1082790531,v1:192.168.122.100:6801/1082790531]
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: mgrc handle_mgr_configure stats_period=5
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 234881024 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137830400 unmapped: 42950656 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137838592 unmapped: 42942464 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137846784 unmapped: 42934272 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137854976 unmapped: 42926080 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137863168 unmapped: 42917888 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137871360 unmapped: 42909696 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137879552 unmapped: 42901504 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137887744 unmapped: 42893312 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137895936 unmapped: 42885120 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137904128 unmapped: 42876928 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137912320 unmapped: 42868736 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137920512 unmapped: 42860544 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137928704 unmapped: 42852352 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137936896 unmapped: 42844160 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137945088 unmapped: 42835968 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137953280 unmapped: 42827776 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138010624 unmapped: 42770432 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138018816 unmapped: 42762240 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138027008 unmapped: 42754048 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138035200 unmapped: 42745856 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137961472 unmapped: 42819584 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137969664 unmapped: 42811392 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137977856 unmapped: 42803200 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137986048 unmapped: 42795008 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137994240 unmapped: 42786816 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301620 data_alloc: 218103808 data_used: 12013568
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138002432 unmapped: 42778624 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config diff' '{prefix=config diff}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config show' '{prefix=config show}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter dump' '{prefix=counter dump}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter schema' '{prefix=counter schema}'
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 137797632 unmapped: 42983424 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: prioritycache tune_memory target: 4294967296 mapped: 138199040 unmapped: 42582016 heap: 180781056 old mem: 2845415832 new mem: 2845415832
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7071000/0x0/0x4ffc00000, data 0x19e7064/0x1a9b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x70ef9c6), peers [0,1] op hist [])
Jan 22 05:14:06 np0005591762 ceph-osd[77912]: do_command 'log dump' '{prefix=log dump}'
Jan 22 05:14:06 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 22 05:14:06 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2714801633' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 22 05:14:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:06 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:06 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:06 np0005591762 nova_compute[225313]: 2026-01-22 10:14:06.967 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:14:07 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:07 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000010s ======
Jan 22 05:14:07 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:07.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Jan 22 05:14:07 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 22 05:14:07 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2774580203' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 22 05:14:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:07 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:07 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:08 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:08 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:08 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:08.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 22 05:14:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/732893454' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 22 05:14:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:08 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:08 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:08 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 22 05:14:08 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/196039347' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3819632351' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 22 05:14:09 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:09 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:14:09 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:09.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/454887452' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2157970857' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1061712218' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 22 05:14:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:09 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:09 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2966956063' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 22 05:14:09 np0005591762 podman[251073]: 2026-01-22 10:14:09.870717181 +0000 UTC m=+0.088086665 container health_status 56a71f4838828bf2ee9babfd7b903d23e837d4aa5ee35e967235bd38e75c811a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 22 05:14:09 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1270663760' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 22 05:14:10 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:10 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:10 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:10.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2328285417' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3269786163' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2305592848' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 22 05:14:10 np0005591762 nova_compute[225313]: 2026-01-22 10:14:10.601 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3768690561' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 22 05:14:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:10 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:10 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 22 05:14:10 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4105894115' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 22 05:14:11 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:11 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:11 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:11.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:11 np0005591762 systemd[1]: Starting Hostname Service...
Jan 22 05:14:11 np0005591762 systemd[1]: Started Hostname Service.
Jan 22 05:14:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:11 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:11 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:11 np0005591762 nova_compute[225313]: 2026-01-22 10:14:11.969 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 22 05:14:12 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:12 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:12 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:12.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:12 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 22 05:14:12 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1249751071' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 22 05:14:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:12 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:12 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3405318944' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 22 05:14:13 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:13 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:13 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:13.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/187143497' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 22 05:14:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:13 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:13 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 22 05:14:13 np0005591762 podman[251831]: 2026-01-22 10:14:13.898788604 +0000 UTC m=+0.119031698 container health_status 03227d683ebe41ac73a35a7749cbf21a3b66208f1952b2cc49016dab62409083 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3fe82618a1e232724f6de40ae7476ca4639ac3a88c6a67055315a726c890e06f-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9-0c504ff58863274e7b5c5b350a517d50c9d55aea2e6b509e780592da74b610f9'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 22 05:14:13 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 22 05:14:14 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:14 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 22 05:14:14 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.102 - anonymous [22/Jan/2026:10:14:14.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 22 05:14:14 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 22 05:14:14 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/42517039' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 22 05:14:14 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 22 05:14:14 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3078252817' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 22 05:14:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:14 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:14 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:15 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 22 05:14:15 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1464737773' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 22 05:14:15 np0005591762 radosgw[81024]: ====== starting new request req=0x7fd98178d5d0 =====
Jan 22 05:14:15 np0005591762 radosgw[81024]: ====== req done req=0x7fd98178d5d0 op status=0 http_status=200 latency=0.001000011s ======
Jan 22 05:14:15 np0005591762 radosgw[81024]: beast: 0x7fd98178d5d0: 192.168.122.100 - anonymous [22/Jan/2026:10:14:15.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000011s
Jan 22 05:14:15 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 22 05:14:15 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2767191961' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 22 05:14:15 np0005591762 ceph-mon[75519]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 22 05:14:15 np0005591762 ceph-mon[75519]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1167492908' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 22 05:14:15 np0005591762 nova_compute[225313]: 2026-01-22 10:14:15.602 225317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 05:14:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-rgw-default-compute-2-udkjbg[85416]: Thu Jan 22 10:14:15 2026: (VI_0) received an invalid passwd!
Jan 22 05:14:15 np0005591762 ceph-43df7a30-cf5f-5209-adfd-bf44298b19f2-keepalived-nfs-cephfs-compute-2-bromuh[88579]: Thu Jan 22 10:14:15 2026: (VI_0) received an invalid passwd!
